
lfpykit's Introduction

LFPy

Summary

LFPy is a Python module for calculating extracellular potentials from multicompartment neuron models. It relies on the NEURON simulator (http://www.neuron.yale.edu/neuron) and uses the Python interface it provides (http://www.frontiersin.org/neuroinformatics/10.3389/neuro.11.001.2009/abstract).

Latest changes

Just updated LFPy? Please check the latest release notes: https://github.com/LFPy/LFPy/releases

Usage

A brief video tutorial on LFPy is available here: https://youtu.be/gCQkyTHZ1lw

LFPy comes preinstalled on the EBRAINS Collaboratory, so you can test LFPy online without installing anything locally:

Note that you might need to be logged into an EBRAINS account for the link to work. To get a free EBRAINS account, sign up here: https://www.ebrains.eu/page/sign-up

A basic simulation of extracellular potentials of a multicompartment neuron model set up with LFPy:

>>> # import modules
>>> import os
>>> import LFPy
>>> from LFPy import Cell, Synapse, LineSourcePotential
>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> # create Cell
>>> cell = Cell(morphology=os.path.join(LFPy.__path__[0], 'test',
>>>                                     'ball_and_sticks.hoc'),
>>>             passive=True,  # NEURON 'pas' mechanism
>>>             tstop=100,  # simulation duration (ms)
>>>            )
>>> # create Synapse
>>> synapse = Synapse(cell=cell,
>>>                   idx=cell.get_idx("soma[0]"),  # soma segment index
>>>                   syntype='Exp2Syn',  # two-exponential synapse
>>>                   weight=0.005,  # max conductance (uS)
>>>                   e=0,  # reversal potential (mV)
>>>                   tau1=0.5,  # rise time constant (ms)
>>>                   tau2=5.,  # decay time constant (ms)
>>>                   record_current=True,  # record synapse current
>>>                  )
>>> synapse.set_spike_times(np.array([20., 40.]))  # set activation times (ms)
>>> # create extracellular predictor
>>> lsp = LineSourcePotential(cell=cell,
>>>                           x=np.zeros(11) + 10,  # x-coordinates of contacts (µm)
>>>                           y=np.zeros(11),  # y-coordinates
>>>                           z=np.arange(11)*20,  # z-coordinates
>>>                           sigma=0.3,  # extracellular conductivity (S/m)
>>>                          )
>>> # execute simulation
>>> cell.simulate(probes=[lsp])  # compute measurements at run time
>>> # plot results
>>> fig, axes = plt.subplots(3, 1, sharex=True, figsize=(12, 8))
>>> axes[0].plot(cell.tvec, synapse.i)
>>> axes[0].set_ylabel('i_syn (nA)')
>>> axes[1].plot(cell.tvec, cell.somav)
>>> axes[1].set_ylabel('V_soma (mV)')
>>> axes[2].pcolormesh(cell.tvec, lsp.z, lsp.data, shading='auto')
>>> axes[2].set_ylabel('z (µm)')
>>> axes[2].set_xlabel('t (ms)')


Information

LFPy provides a set of easy-to-use Python classes for setting up your model, running your simulations and calculating the extracellular potentials arising from activity in your model neuron. If you have a model working in NEURON (www.neuron.yale.edu) already, it is likely that it can be adapted to work with LFPy.

The extracellular potentials are calculated from transmembrane currents in multicompartment neuron models using the line-source method (Holt & Koch, J Comp Neurosci 1999); a simpler point-source method is also available. The calculations assume that the neuron is surrounded by an infinite extracellular medium with homogeneous and frequency-independent conductivity, and that compartments lie at least a minimal distance (which can be specified by the user) from the electrode. For more information on the biophysics underlying the numerical framework, see this forthcoming book chapter:
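As a rough illustration of the simpler point-source approximation mentioned above (a sketch, not LFPy's internal implementation): a point current source I in an infinite homogeneous medium of conductivity σ produces the potential φ = I / (4πσr) at distance r:

```python
import numpy as np

def point_source_potential(imem, r, sigma=0.3, r_min=1.0):
    """Point-source approximation of the extracellular potential.

    imem  : transmembrane current (nA)
    r     : distance(s) from source to electrode (um)
    sigma : extracellular conductivity (S/m)
    r_min : minimum allowed distance (um), guarding the singularity at r = 0
    """
    r = np.maximum(r, r_min)
    # phi = I / (4 * pi * sigma * r); with I in nA, sigma in S/m and r
    # in um, the result conveniently comes out directly in mV
    return imem / (4 * np.pi * sigma * r)

# potential 10 um away from a 1 nA point source in sigma = 0.3 S/m
phi = point_source_potential(1.0, 10.0)  # ~0.0265 mV
```

The line-source method refines this by integrating the same kernel along each cylindrical segment instead of collapsing it to a point.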

The first release of LFPy (v1.x) was mainly designed for simulating extracellular potentials of single neurons, as described in our paper in Frontiers in Neuroinformatics, "LFPy: A tool for biophysical simulation of extracellular potentials generated by detailed model neurons" (https://dx.doi.org/10.3389/fninf.2013.00041). Since version 2 (LFPy v2.x), the tool also facilitates simulations of extracellular potentials and current dipole moments from ongoing activity in recurrently connected networks of multicompartment neurons, as well as prediction of EEG scalp surface potentials and MEG scalp surface magnetic fields, as described in "Multimodal modeling of neural network activity: computing LFP, ECoG, EEG and MEG signals with LFPy 2.0" by Espen Hagen, Solveig Naess, Torbjoern V Ness and Gaute T Einevoll (https://dx.doi.org/10.3389/fninf.2018.00092).

Citing LFPy

  • LFPy v2.x: Hagen E, Næss S, Ness TV and Einevoll GT (2018) Multimodal Modeling of Neural Network Activity: Computing LFP, ECoG, EEG, and MEG Signals With LFPy 2.0. Front. Neuroinform. 12:92. doi: 10.3389/fninf.2018.00092. https://dx.doi.org/10.3389/fninf.2018.00092

  • LFPy v1.x: Linden H, Hagen E, Leski S, Norheim ES, Pettersen KH and Einevoll GT (2013). LFPy: A tool for biophysical simulation of extracellular potentials generated by detailed model neurons. Front. Neuroinform. 7:41. doi: 10.3389/fninf.2013.00041. https://dx.doi.org/10.3389/fninf.2013.00041

LFPy was developed in the Computational Neuroscience Group, Department of Mathematical Sciences and Technology (http://www.nmbu.no/imt), at the Norwegian University of Life Sciences (http://www.nmbu.no), in collaboration with the Laboratory of Neuroinformatics (http://www.nencki.gov.pl/en/laboratory-of-neuroinformatics), Nencki Institute of Experimental Biology (http://www.nencki.gov.pl), Warsaw, Poland. The effort was supported by the International Neuroinformatics Coordinating Facility (http://incf.org), the Research Council of Norway (http://www.forskningsradet.no/english) (eScience, NevroNor), EU-FP7 (BrainScaleS, http://www.brainscales.org), and the European Union Horizon 2020 Framework Programme for Research and Innovation under Specific Grant Agreements No. 785907 and No. 945539 [Human Brain Project (HBP) SGA2, SGA3 and EBRAINS].

For updated information on LFPy and online documentation, see the LFPy homepage (http://lfpy.readthedocs.io).

Tutorial slides on LFPy

Related projects

LFPy has been used extensively in ongoing and published work, and is a required dependency of several publicly available Python modules:

Requirements

Dependencies should normally be automatically installed. For manual preinstallation of dependencies, the following packages are needed:

Installation

There are a few options for installing LFPy:

  1. From the Python Package Index with only local access using pip:

     pip install LFPy --user
    

    as sudoer (in general not recommended as system Python files may be overwritten):

     sudo pip install LFPy
    

    Upgrading LFPy from the Python package index (without attempts at upgrading dependencies):

     pip install --upgrade --no-deps LFPy --user
    

    LFPy release candidates can be installed as:

     pip install --pre --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple LFPy --user
    
  2. From the Python Package Index with only local access using easy_install (deprecated; prefer pip):

     easy_install --user LFPy
    

    As sudoer:

     sudo easy_install LFPy
    
  3. From source:

     tar -xzf LFPy-x.x.tar.gz
     cd LFPy-x.x
     (sudo) python setup.py develop (--user)
    
  4. Development version from the GitHub repository:

     git clone https://github.com/LFPy/LFPy.git
     cd LFPy
     (sudo) pip install -r requirements.txt (--user) # install dependencies
     (sudo) python setup.py develop (--user)
    
  5. Anaconda Python (https://www.anaconda.com, macOS/Linux only):

    Add the conda-forge (https://conda-forge.org) as channel:

     conda config --add channels conda-forge
     conda config --set channel_priority strict  # suggested
    

    Create a new conda environment with LFPy and activate it:

     conda create -n lfpy python=3 pip lfpy  # creates new Python 3.x conda environment named lfpy with pip and LFPy and their dependencies
     conda activate lfpy  # activate the lfpy environment
     python -c "import LFPy; LFPy.run_tests()"  # check that installation is working
    

    LFPy can also be installed in existing conda environments if the dependency tree is solvable:

     conda activate <environment>
     conda install lfpy  # installs LFPy and its dependencies in the current conda environment
    

Uninstall

To remove installed LFPy files it should suffice to issue (repeat until no more LFPy files are found):

(sudo) pip uninstall LFPy

In case LFPy was installed using conda in an environment, it can be uninstalled by issuing:

conda uninstall lfpy

Docker

We provide a Docker (https://www.docker.com) container recipe file with LFPy. To get started, install Docker and issue either:

# build Dockerfile from GitHub
$ docker build -t lfpy https://raw.githubusercontent.com/LFPy/LFPy/master/Dockerfile
$ docker run -it -p 5000:5000 lfpy:latest

or

# build local Dockerfile (obtained by cloning repo, checkout branch etc.)
$ docker build -t lfpy - < Dockerfile
$ docker run -it -p 5000:5000 lfpy:latest

If the Docker build should fail for some reason, it is possible to store the build log and avoid build caches by issuing:

docker build --no-cache --progress=plain -t lfpy - < Dockerfile 2>&1 | tee lfpy.log

If the build is successful, the --mount option can be used to mount a folder on the host to a target folder as:

docker run --mount type=bind,source="$(pwd)",target=/opt -it -p 5000:5000 lfpy

which mounts the present working directory ($(pwd)) to the /opt directory of the container. Try, for example, mounting the LFPy source directory (by setting source="<path-to-LFPy>"). Various LFPy example files can then be found in the folder /opt/LFPy/examples/ when the container is running.

Jupyter notebook servers running from within the container can be accessed after invoking them by issuing:

cd /opt/LFPy/examples/
jupyter-notebook --ip 0.0.0.0 --port=5000 --no-browser --allow-root

and opening the resulting URL in a browser on the host computer, similar to: http://127.0.0.1:5000/?token=dcf8f859f859740fc858c568bdd5b015e0cf15bfc2c5b0c1

HTML Documentation

To generate the html documentation also hosted at https://lfpy.rtfd.io using Sphinx, issue from the LFPy source code directory:

cd doc
make html

The main html file is _build/html/index.html. The packages m2r2, numpydoc and the Sphinx ReadTheDocs theme may be needed:

pip install m2r2 --user
pip install numpydoc --user
pip install sphinx-rtd-theme --user

Physical units in LFPy

Physical units follow the NEURON conventions. The units in LFPy for given quantities are:

Quantity                    Symbol      Unit
Spatial dimensions          x, y, z, d  [μm]
Potential                   v, Phi, Φ   [mV]
Reversal potential          E           [mV]
Current                     i           [nA]
Membrane capacitance        c_m         [μF/cm2]
Membrane conductance        g           [S/cm2]
Synaptic conductance        g           [µS]
Extracellular conductivity  sigma, σ    [S/m]
Current dipole moment       P           [nA µm]
Magnetic field              H           [nA/µm]
Magnetic permeability       µ, mu       [T m/A]
Current source density      CSD         [nA/µm3]

Note: resistance, conductance and capacitance are usually specific values, i.e., per membrane area (lowercase r_m, g, c_m). Depending on the mechanism files, some may use different units altogether, but this should be taken care of internally by NEURON.
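To illustrate how these specific (per-area) quantities relate to absolute values for a single segment (a sketch with hypothetical parameter values, not LFPy code):

```python
import numpy as np

# specific membrane parameters (NEURON conventions, hypothetical values)
c_m = 1.0     # specific capacitance (uF/cm2)
g_pas = 1e-4  # specific leak conductance (S/cm2)

# cylindrical segment geometry
diam, length = 2.0, 10.0                 # um
area_cm2 = np.pi * diam * length * 1e-8  # lateral area; 1 um2 = 1e-8 cm2

# absolute values for this one segment
C = c_m * area_cm2    # total capacitance (uF)
G = g_pas * area_cm2  # total conductance (S)

# the passive membrane time constant is independent of area:
tau_ms = c_m / g_pas * 1e-3  # (uF/cm2)/(S/cm2) = us; converted to ms
```

With these numbers the time constant comes out to 10 ms regardless of segment size, which is why NEURON parameterizes membranes with specific quantities.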

lfpykit's People

Contributors

espenhgn, torbjone


lfpykit's Issues

Support Python 3.9

Python 3.9 is now official and should be supported via Travis CI, PyPI builds, etc.

rename `get_response_matrix()` method to `get_transformation_matrix()`

Up for discussion: each forward-model-like class should have a public method get_response_matrix().
Should this be renamed, or is it fine as is? Some libraries use get_transfer_resistance (or similar), but that doesn't make sense for a class like CurrentDipoleMoment, for instance.
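Whatever the final name, the usage pattern under discussion can be sketched as follows (class name, matrix values, and shapes here are placeholders, not the actual lfpykit API):

```python
import numpy as np

# Hypothetical sketch: each forward-model class exposes a matrix M
# mapping per-segment transmembrane currents to measurements,
# i.e. measurement = M @ imem
class ForwardModel:
    def __init__(self, M):
        self._M = M  # shape (n_measurements, n_segments)

    def get_transformation_matrix(self):
        return self._M

rng = np.random.default_rng(0)
n_contacts, n_segments, n_tsteps = 4, 10, 100
model = ForwardModel(rng.standard_normal((n_contacts, n_segments)))
imem = rng.standard_normal((n_segments, n_tsteps))  # currents (nA)
V = model.get_transformation_matrix() @ imem        # potentials (mV)
```

A name like get_transformation_matrix stays agnostic about what the matrix physically represents, which is the point raised above about CurrentDipoleMoment.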

Rename module

Rename module to something nicer before initial release. Rename repository accordingly.

conda-forge recipe

Optional, as this project should not rely on difficult dependencies; but when this project reaches a more mature state, we should consider a conda-forge recipe.

Variable diameter segments (Arbor)

In Arbor, the discretization of dynamics into compartments (CVs) doesn't line up with the discretization of the morphology into piecewise-linear segments.

Consequently, any one sample of a transmembrane current on the cell corresponds to the current over multiple cylinders (or truncated conical frusta, as the case may be). We can apportion the current across each segment proportionally by area and sum against the response for each of these segments, but it would be more efficient to do it in one step, and have the response for a CV current be the area-weighted sum of the responses from each segment.

This particular case needs to be incorporated. At the moment GeometryCell assumes input x,y,z dimensions (n_seg, n_points_along_seg) but constant diameters per segment (parameter d with shape (n_seg, )).

The same feature could be used with NEURON/LFPy models as well, in case the full pt3d information present in the morphology .swc/.hoc files is read, although the physical interpretation may not be identical to that of Arbor. For instance, we should assess whether or not the assumption of constant current density holds in a variable-diameter part of an electrical segment.
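The area-weighted summing described above can be sketched as follows (shapes, areas, and the CV-to-segment mapping are assumptions for illustration):

```python
import numpy as np

# Per-segment responses with shape (n_contacts, n_seg); segments are
# grouped into CVs, and the per-CV response is the area-weighted sum of
# its segments' responses.
n_contacts, n_seg, n_cv = 3, 6, 2
seg_response = np.arange(n_contacts * n_seg, dtype=float).reshape(n_contacts, n_seg)
seg_area = np.array([1., 2., 1., 3., 2., 1.])  # um2, hypothetical
cv_of_seg = np.array([0, 0, 0, 1, 1, 1])       # segment -> CV membership

cv_response = np.zeros((n_contacts, n_cv))
for cv in range(n_cv):
    mask = cv_of_seg == cv
    weights = seg_area[mask] / seg_area[mask].sum()  # sum to 1 per CV
    cv_response[:, cv] = seg_response[:, mask] @ weights

# a CV current I_cv now contributes cv_response[:, cv] * I_cv directly,
# equivalent to first apportioning I_cv across segments by area
```

Precomputing cv_response once makes the per-timestep cost scale with the number of CVs rather than the (larger) number of morphology segments.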

update README.md

  • simple example showing usage

  • description of intended use

  • installation

  • link to resources (documentation, literature etc.)

Adapt class methods that are LFPy specific

Some class methods like FourSphereVolumeConductor.calc_potential_from_multi_dipoles() use LFPy.Cell.calc_potential_from_multi_dipoles() and similar.

My suggestion is that we set up subclasses in LFPy which adds these multi-dipole methods only to LFPy and remove the corresponding methods here. Or should we keep calc_potential_from_multi_dipoles()?

Volumetric CSD

A class VolumetricCurrentSourceDensity which facilitates computing the ground-truth CSD in volumetric bins with edges defined by X, Y, Z = np.mgrid[...] or X, Y, Z = np.meshgrid(x, y, z).
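One possible approach (a sketch, not the actual class) is to histogram the transmembrane currents into the volumetric bins and normalize by bin volume:

```python
import numpy as np

# Hypothetical sketch: bin per-segment transmembrane currents into
# (x, y, z) bins, then divide by bin volume to obtain a CSD (nA/um3)
rng = np.random.default_rng(1)
n_seg = 50
pos = rng.uniform(0, 100, size=(n_seg, 3))  # segment midpoints (um)
imem = rng.standard_normal(n_seg)           # currents, one time step (nA)

edges = [np.linspace(0, 100, 6)] * 3        # 5 bins of 20 um per axis
csd, _ = np.histogramdd(pos, bins=edges, weights=imem)
csd /= 20.0 ** 3                            # nA -> nA/um3

# total current is conserved: csd.sum() * bin_volume == imem.sum()
```

The bin edges could equally be taken from np.mgrid/np.meshgrid output as proposed above; looping over time steps would yield the full spatiotemporal CSD.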

uniform way to use current dipole forward models.

Add a method that returns the linear response matrix M between a position-dependent measurement Y and the current dipole moment P as
Y = M P
to FourSphereVolumeConductor and InfiniteVolumeConductor.

The different classes must assume that the shape of the current dipole moment P is (3, n_tsteps). The current implementations assume the opposite (i.e., the transpose)!
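The proposed shape convention can be illustrated as follows (matrix values and contact count are placeholders):

```python
import numpy as np

# Proposed convention: P has shape (3, n_tsteps); a position-dependent
# matrix M maps it to a measurement Y = M @ P
n_tsteps = 100
P = np.zeros((3, n_tsteps))  # rows are (p_x, p_y, p_z) over time
P[2, :] = 1.0                # a constant z-directed dipole (nA um)

n_contacts = 5
M = np.ones((n_contacts, 3))  # hypothetical response matrix
Y = M @ P                     # measurement, shape (n_contacts, n_tsteps)

# a legacy implementation expecting shape (n_tsteps, 3) would need P.T
```

Settling on (3, n_tsteps) keeps the matrix product in the natural Y = M P order without transposes at every call site.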

Make sure lfpykit/tests/fem_mix_dip.npz is copied during install

Describe the bug
The file lfpykit/tests/fem_mix_dip.npz is not copied when installing via python setup.py install (--user)

Test output

__________________________ testFourSphereVolumeConductor.test_calc_potential01 __________________________

self = <test_eegmegcalc.testFourSphereVolumeConductor testMethod=test_calc_potential01>

    def test_calc_potential01(self):
        '''test comparison between analytical 4S-model and FEM simulation'''
        # load data
        fem_sim = np.load(
            os.path.join(
                lfpykit.__path__[0],
                'tests',
>               'fem_mix_dip.npz'))

lfpykit/tests/test_eegmegcalc.py:410:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

file = '/p/home/jusers/hagen1/jureca/.local/lib/python3.6/site-packages/LFPykit-0.1a2-py3.6.egg/lfpykit/tests/fem_mix_dip.npz', mmap_mode = None, allow_pickle = True, fix_imports = True
encoding = 'ASCII'

    def load(file, mmap_mode=None, allow_pickle=True, fix_imports=True,
             encoding='ASCII'):
        [...]
        own_fid = False
        if isinstance(file, basestring):
>           fid = open(file, "rb")
E           NotADirectoryError: [Errno 20] Not a directory: '/p/home/jusers/hagen1/jureca/.local/lib/python3.6/site-packages/LFPykit-0.1a2-py3.6.egg/lfpykit/tests/fem_mix_dip.npz'

numpy/lib/npyio.py:384: NotADirectoryError

Package test scripts

Describe the bug
lfpykit/tests/*.py aren't included in the setuptools package data. They should be.

Use NEURON from PyPI for tests

The conda-forge release of NEURON is outdated compared with the official NEURON release on PyPI. Unit tests should use the PyPI version.

forward models

Port forward-model implementations from LFPy. Convert them to classes where they are not classes already. Class names may be up for discussion.

  • lfpcalc.calc_lfp_*()

  • class PointSourcePotential

  • class LineSourcePotential

  • class RecExtElectrode

  • class RecMEAElectrode

  • class CurrentDipoleMoment

  • class MEG

  • class FourSphereVolumeConductor for current dipole moments

  • class OneSphereVolumeConductor for current dipole moments

  • class InfiniteVolumeConductor for current dipole moments
