
neuron_reduce

Introduction

Neuron_Reduce provides an analytical method for reducing neuron model complexity. It enables the mapping of synapses and active ion channels to a computationally simpler model while accelerating simulation speed by up to 200-fold for inputs consisting of thousands of dendritic synapses.

Full details are available in the accompanying paper: An efficient analytical reduction of detailed nonlinear neuron models. Nat. Commun., 11 (2020), p. 288 - https://www.nature.com/articles/s41467-019-13932-6

Installation

pip install --user neuron_reduce

Quick Start

The following code shows the main function used to reduce a complex cell.

complex_cell  # the model cell
synapses_list # a list of all synapses on this cell
netcons_list  # a list of all NetCons for the synapses on the cell

import neuron_reduce
reduced_cell, synapses_list, netcons_list = neuron_reduce.subtree_reductor(complex_cell, synapses_list, netcons_list)

Detailed example

Clone the repository from GitHub

git clone https://github.com/orena1/neuron_reduce.git

Go to the example folder

cd neuron_reduce
cd example
nrnivmodl mod  # compile the mod files

Open Python and run the following code:

from __future__ import division
from neuron import gui,h
import numpy as np
import neuron_reduce
import time
import matplotlib.pyplot as plt



#Create a L5_PC model
h.load_file('L5PCbiophys3.hoc')
h.load_file("import3d.hoc")
h.load_file('L5PCtemplate.hoc')
complex_cell = h.L5PCtemplate('cell1.asc')
h.celsius = 37
h.v_init = complex_cell.soma[0].e_pas


#Add synapses to the model
synapses_list, netstims_list, netcons_list, randoms_list = [], [], [] ,[]

all_segments = [i for j in map(list,list(complex_cell.apical)) for i in j] + [i for j in map(list,list(complex_cell.basal)) for i in j]
len_per_segment = np.array([seg.sec.L/seg.sec.nseg for seg in all_segments])
rnd = np.random.RandomState(10)
for i in range(10000):
    seg_for_synapse = rnd.choice(all_segments, p=len_per_segment/sum(len_per_segment))
    synapses_list.append(h.Exp2Syn(seg_for_synapse))
    if rnd.uniform()<0.85:
        e_syn, tau1, tau2, spike_interval, syn_weight = 0, 0.3, 1.8,  1000/2.5, 0.0016
    else:
        e_syn, tau1, tau2, spike_interval, syn_weight = -86, 1,   8,   1000/15.0, 0.0008
    #set synaptic variables
    synapses_list[i].e, synapses_list[i].tau1, synapses_list[i].tau2 = e_syn, tau1, tau2
    #set netstim variables
    netstims_list.append(h.NetStim())
    netstims_list[i].interval, netstims_list[i].number, netstims_list[i].start, netstims_list[i].noise = spike_interval, 9e9, 100, 1
    #set random
    randoms_list.append(h.Random())
    randoms_list[i].Random123(i)
    randoms_list[i].negexp(1)
    netstims_list[i].noiseFromRandom(randoms_list[i])       
    #set netcon variables
    netcons_list.append(h.NetCon(netstims_list[i], synapses_list[i] ))
    netcons_list[i].delay, netcons_list[i].weight[0] = 0, syn_weight

#Simulate the full neuron for 1 second
soma_v = h.Vector()
soma_v.record(complex_cell.soma[0](0.5)._ref_v)

time_v = h.Vector()
time_v.record(h._ref_t)

h.tstop = 1000
st = time.time()
h.run()
print('complex cell simulation time {:.4f}'.format(time.time()-st))
complex_cell_v = list(soma_v)



#Apply neuron_reduce to simplify the cell
reduced_cell, synapses_list, netcons_list = neuron_reduce.subtree_reductor(complex_cell, synapses_list, netcons_list, reduction_frequency=0, total_segments_manual=-1)
for r in randoms_list:
    r.seq(1)  # reset the random streams so both runs receive identical inputs


#Run the simulation again, now on the reduced cell
st = time.time()
h.run()
print('reduced cell simulation time {:.4f}'.format(time.time()-st))
reduced_cell_v = list(soma_v)

#Plot the results
plt.figure()
plt.plot(time_v, complex_cell_v, label='complex cell')
plt.plot(time_v, reduced_cell_v, label='reduced cell')
plt.legend()
plt.show()

Citation

O. Amsalem, G. Eyal, N. Rogozinski, M. Gevaert, P. Kumbhar, F. Schürmann, I. Segev. An efficient analytical reduction of detailed nonlinear neuron models. Nat. Commun., 11 (2020), p. 288

Contributors

asanin-epfl, mgeplf, orena1


Issues

Question about model.hoc

If I use my own custom.hoc as model_template, do I need to call geom_nseg() and biophys() from the default model.hoc in it?

neuron_reduce.subtree_reductor(...,model_filename='custom.hoc')

Question about `total_segments_manual` argument

Could you please clarify the total_segments_manual argument of subtree_reductor? I need to understand it better.
I have a cell with 11 synapses. During the reduction, in the method merge_and_add_synapses at this line, they receive the following positions (offset + section segment):

  0.032426184821815696 model[0].dend[0]
  0.032426184821815696 model[0].dend[0]
  0.012948382704962572 model[0].dend[1]
  0.10025492308704999 model[0].dend[2]
  0.10025492308704999 model[0].dend[2]
  0.10224592355734784 model[0].dend[3]
  0.059093705516716 model[0].dend[5]
  0.059093705516716 model[0].dend[5]
  0.09401371780768385 model[0].dend[5]
  0.4519636798941064 model[0].dend[5]
  0.3752090096168105 model[0].dend[5]

Later, when I access those synapses in NEURON, they have different positions: because the segmentation of the sections is very coarse, very precise offsets like 0.032426184821815696 are snapped to very different offsets like 0.1. What would you suggest in this situation in order to preserve the original offsets as much as possible? Below are the same positions after the change:

0.1 model[0].dend[0]
0.1 model[0].dend[0]
0.1 model[0].dend[1]
0.1 model[0].dend[2]
0.1 model[0].dend[2]
0.16666 model[0].dend[3]
0.1 model[0].dend[5]
0.1 model[0].dend[5]
0.1 model[0].dend[5]
0.5 model[0].dend[5]
0.3 model[0].dend[5]
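The snapping above follows from NEURON's discretization: a point process placed at a continuous location x in (0, 1) is attached to the center of the segment containing it. A minimal sketch of that mapping (the nseg values below are assumptions chosen to reproduce the offsets in this issue, not values read from the model):

```python
def snap_to_segment_center(x, nseg):
    """Map a continuous location x in (0, 1) to the center of the
    containing segment, as NEURON does with nseg segments per section."""
    i = min(int(x * nseg), nseg - 1)   # index of the containing segment
    return (2 * i + 1) / (2 * nseg)    # segment centers sit at (2i+1)/(2*nseg)

# With an assumed nseg=5 for dend[0], 0.0324... snaps to 0.1:
print(snap_to_segment_center(0.032426184821815696, 5))    # -> 0.1
# With an assumed nseg=3 for dend[3], 0.1022... snaps to 0.16666...:
print(snap_to_segment_center(0.10224592355734784, 3))
# Raising nseg preserves more of the original offset:
print(snap_to_segment_center(0.032426184821815696, 101))  # -> ~0.0347
```

This suggests the offsets can only be preserved up to the resolution 1/nseg of each reduced section, so increasing nseg (e.g. via total_segments_manual) tightens the match.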

Saving the reduced model

Hello,

I would like to ask about my options for saving the reduced model. Is the only option to pickle the output of subtree_reductor and load it later in Python, or is it possible to save the model in a format that can later be opened in NEURON?

Thanks in advance!
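One possible direction (a sketch, not an official neuron_reduce feature): live NEURON section objects are generally not picklable, so the reduced cell's geometry and passive properties can be extracted into plain Python data, saved as JSON, and used later to rebuild the sections. The function below works on any iterable of objects exposing the usual NEURON section attributes; active mechanism parameters would need analogous per-segment extraction.

```python
import json

def dump_sections(sections, path):
    """Serialize basic geometry and passive properties of NEURON-like
    sections (name(), L, nseg, Ra, cm, per-segment diam) to JSON."""
    data = [{
        'name': sec.name(),
        'L': sec.L,
        'nseg': sec.nseg,
        'Ra': sec.Ra,
        'cm': sec.cm,
        'diam': [seg.diam for seg in sec],  # iterating a section yields segments
    } for sec in sections]
    with open(path, 'w') as f:
        json.dump(data, f, indent=2)
```

Rebuilding would then mean creating h.Section() objects, restoring these values, and re-inserting mechanisms; the synapse/netcon lists returned by subtree_reductor would still have to be re-created separately.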

Question about reduction of source cells (Outcome synapses)

Hi! I understand this situation is improbable, but I think it might exist in practice. An example:

from neuron import h

h.load_file('L5PCtemplate.hoc')
source_cell = h.L5PCtemplate('cell1.asc')
target_cell = h.L5PCtemplate('cell1.asc')
synapse = h.Exp2Syn(target_cell.apical[12](0.5))
netcon = h.NetCon(
    source_cell.axon[6](0.5),
    synapse)

Everything is fine when we reduce target_cell: we get a reduced location for synapse and update it. The question is how we should proceed if source_cell is reduced, since we would then need to update the source location of netcon. How would you suggest handling this? Is it an impossible situation in real neuroscience simulations?

Replace `print` statements with `logging`

During circuit reduction the output of neuron_reduce seriously swamps other output. Would it be OK if I replaced the print statements with logging functionality? I have around 200 KB of output such as:

There is no segment to segment copy, it means that some segments in the reduce...
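A minimal sketch of the proposed change (the function name is illustrative, not an existing neuron_reduce symbol): route messages through a module-level logger so callers can filter, redirect, or silence them without touching library code.

```python
import logging

# Module-level logger; a library should not configure handlers itself,
# leaving verbosity and formatting under the caller's control.
logger = logging.getLogger('neuron_reduce')

def warn_no_segment_copy(detail):
    # was: print('There is no segment to segment copy, ...')
    logger.warning('There is no segment to segment copy: %s', detail)
```

A consumer drowning in output could then run `logging.getLogger('neuron_reduce').setLevel(logging.ERROR)` once and suppress all such messages.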

Exception: no child seg nor parent seg, with active channels, was found

Hi! I receive the error in the title when I do this:

from neuron import gui,h
import neuron_reduce

h.load_file('bAC_327962063.hoc')
complex_cell = h.bAC_327962063(0, '', '02583f52ff47b88961e4216e2972ee8c.swc')
neuron_reduce.subtree_reductor(complex_cell, [], [], reduction_frequency=0)
Traceback (most recent call last):
  File "/sonata_network_reduction/reduce.py", line 21, in <module>
    neuron_reduce.subtree_reductor(complex_cell, [], [], reduction_frequency=0)
  File "/venv/lib/python3.6/site-packages/neuron_reduce/subtree_reductor_func.py", line 871, in subtree_reductor
    mapping_type)
  File "/venv/lib/python3.6/site-packages/neuron_reduce/subtree_reductor_func.py", line 352, in copy_dendritic_mech
    mech_names_per_segment)
  File "/venv/lib/python3.6/site-packages/neuron_reduce/subtree_reductor_func.py", line 387, in handle_orphan_segments
    raise Exception("no child seg nor parent seg, with active channels, was found")
Exception: no child seg nor parent seg, with active channels, was found

Some details: there is no apic SectionList. create_reduced_cell creates 9 basal sections: {model[0].dend[0], model[0].dend[1], model[0].dend[2], model[0].dend[3], model[0].dend[4], model[0].dend[5], model[0].dend[6], model[0].dend[7], model[0].dend[8]}. Somehow the last basal (model[0].dend[8]) is not in the reduced_seg_to_original_seg mapping returned by create_seg_to_seg, and because it is a section with a single segment, there are no parent or child segments to copy mechanisms from, hence the error.

I've sent the files on your email.
