libneuroml's People

Contributors

apdavison, arosh, baladkb, clbarnes, jefferis, kapilkd13, lebedov, mattions, mstimberg, pgleeson, sanjayankur31, unidesigner, vellamike

libneuroml's Issues

AttributeError: 'Property' object has no attribute 'includes'

Description

While using the -graph option with pynml on a simple network example, pynml crashes with this error.

Versions of software

pyNeuroML v0.5.6: Python utilities for NeuroML2
    libNeuroML v0.2.52
    jNeuroML v0.10.1

How reproducible

Always

Steps to reproduce

  1. Run this python script to generate the NeuroML/LEMS descriptions:
#!/usr/bin/env python3
"""
Create a simple network with two populations.
"""

from neuroml import NeuroMLDocument
from neuroml import Izhikevich2007Cell
from neuroml import Network
from neuroml import ExpOneSynapse
from neuroml import Population
from neuroml import Projection
from neuroml import PulseGenerator
from neuroml import ExplicitInput
from neuroml import Connection
import neuroml.writers as writers
import random
from pyneuroml import pynml
from pyneuroml.lems import LEMSSimulation
import numpy as np


nml_doc = NeuroMLDocument(id="IzNet")

iz0 = Izhikevich2007Cell(
    id="iz2007RS0", v0="-60mV", C="100pF", k="0.7nS_per_mV", vr="-60mV",
    vt="-40mV", vpeak="35mV", a="0.03per_ms", b="-2nS", c="-50.0mV", d="100pA")
nml_doc.izhikevich2007_cells.append(iz0)

syn0 = ExpOneSynapse(id="syn0", gbase="65nS", erev="0mV", tau_decay="3ms")
nml_doc.exp_one_synapses.append(syn0)

net = Network(id="IzNet")
nml_doc.networks.append(net)

size0 = 5
pop0 = Population(id="IzPop0", component=iz0.id, size=size0)
net.populations.append(pop0)

size1 = 5
pop1 = Population(id="IzPop1", component=iz0.id, size=size1)
net.populations.append(pop1)

proj = Projection(id='proj', presynaptic_population=pop0.id,
                  postsynaptic_population=pop1.id, synapse=syn0.id)
net.projections.append(proj)

random.seed(921)
prob_connection = 0.5
count = 0
for pre in range(0, size0):
    pg = PulseGenerator(
        id="pulseGen_%i" % pre, delay="0ms", duration="10000ms",
        amplitude="%f nA" % (0.1 * random.random())
    )
    nml_doc.pulse_generators.append(pg)

    exp_input = ExplicitInput(target="%s[%i]" % (pop0.id, pre), input=pg.id)
    net.explicit_inputs.append(exp_input)

    for post in range(0, size1):
        if random.random() <= prob_connection:
            syn = Connection(id=count,
                             pre_cell_id="../%s[%i]" % (pop0.id, pre),
                             synapse=syn0.id,
                             post_cell_id="../%s[%i]" % (pop1.id, post))
            proj.connections.append(syn)
            count += 1

nml_file = 'izhikevich2007_network.nml'
writers.NeuroMLWriter.write(nml_doc, nml_file)

print("Written network file to: " + nml_file)
pynml.validate_neuroml2(nml_file)

simulation_id = "example_izhikevich2007network_sim"
simulation = LEMSSimulation(sim_id=simulation_id,
                            duration=10000, dt=0.1, simulation_seed=123)
simulation.assign_simulation_target(net.id)
simulation.include_neuroml2_file(nml_file)

simulation.create_event_output_file(
    "pop0", "%s.spikes.dat" % simulation_id, format='ID_TIME'
)

for pre in range(0, size0):
    simulation.add_selection_to_event_output_file(
        "pop0", pre, 'IzPop0/{}'.format(pre), 'spike')

lems_simulation_file = simulation.save_to_file()

pynml.run_lems_with_jneuroml_neuron(
    lems_simulation_file, max_memory="20G", nogui=True, plot=False
)

# Load the data from the file and plot the spike times
# using the pynml generate_plot utility function.
data_array = np.loadtxt("%s.spikes.dat" % simulation_id)
pynml.generate_plot(
    [data_array[:, 1]], [data_array[:, 0]],
    "Spike times", show_plot_already=False,
    save_figure_to="%s-spikes.png" % simulation_id,
    xaxis="time (s)", yaxis="cell ID",
    linestyles='', linewidths='0', markers=['.'],
)

So:

python izhikevich-network.py
  2. On the generated LEMS file, run pynml -graph:
$ pynml LEMS_example_izhikevich2007network_sim.xml -graph 1
neuromllite >>> Initiating GraphViz handler, level 1, engine: dot, seed: 1234
Parsing: LEMS_example_izhikevich2007network_sim.xml
Traceback (most recent call last):
  File "/usr/bin/pynml", line 33, in <module>
    sys.exit(load_entry_point('pyNeuroML==0.5.6', 'console_scripts', 'pynml')())
  File "/usr/lib/python3.9/site-packages/pyneuroml/pynml.py", line 1644, in main
    evaluate_arguments(args)
  File "/usr/lib/python3.9/site-packages/pyneuroml/pynml.py", line 1165, in evaluate_arguments
    currParser.parse(args.lems_file)
  File "/usr/lib/python3.9/site-packages/neuroml/hdf5/NeuroMLXMLParser.py", line 77, in parse
    self.nml_doc = loaders.read_neuroml2_file(filename,
  File "/usr/lib/python3.9/site-packages/neuroml/loaders.py", line 239, in read_neuroml2_file
    return _read_neuroml2(nml2_file_name, include_includes=include_includes, verbose=verbose,
  File "/usr/lib/python3.9/site-packages/neuroml/loaders.py", line 279, in _read_neuroml2
    for include in nml2_doc.includes:
AttributeError: 'Property' object has no attribute 'includes'

Additional information

jnml seems to work fine:

jnml LEMS_example_izhikevich2007network_sim.xml -graph
 jNeuroML v0.10.1
(INFO) Reading from: /home/asinha/Documents/02_Code/00_mine/2020-OSB/NeuroML-Documentation/source/Userdocs/NML2_examples/LEMS_example_izhikevich2007network_sim.xml
(INFO) simCpt: Component(id=example_izhikevich2007network_sim type=Simulation)
Writing to: /home/asinha/Documents/02_Code/00_mine/2020-OSB/NeuroML-Documentation/source/Userdocs/NML2_examples/LEMS_example_izhikevich2007network_sim.gv
Have successfully run command: dot -Tpng  /home/asinha/Documents/02_Code/00_mine/2020-OSB/NeuroML-Documentation/source/Userdocs/NML2_examples/LEMS_example_izhikevich2007network_sim.gv -o /home/asinha/Documents/02_Code/00_mine/2020-OSB/NeuroML-Documentation/source/Userdocs/NML2_examples/LEMS_example_izhikevich2007network_sim.png

I expect it's because the loader is looking for includes (for include in nml2_doc.includes), but this rather simple NeuroML document generated by the script doesn't have any includes:

<neuroml xmlns="http://www.neuroml.org/schema/neuroml2"  xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.neuroml.org/schema/neuroml2 https://raw.github.com/NeuroML/NeuroML2/development/Schemas/NeuroML2/NeuroML_v2.1.xsd" id="IzNet">
    <expOneSynapse id="syn0" gbase="65nS" erev="0mV" tauDecay="3ms"/>
    <izhikevich2007Cell id="iz2007RS0" C="100pF" v0="-60mV" k="0.7nS_per_mV" vr="-60mV" vt="-40mV" vpeak="35mV" a="0.03per_ms" b="-2nS" c="-50.0mV" d="100pA"/>
    <pulseGenerator id="pulseGen_0" delay="0ms" duration="10000ms" amplitude="0.098292 nA"/>
    <pulseGenerator id="pulseGen_1" delay="0ms" duration="10000ms" amplitude="0.076471 nA"/>
    <pulseGenerator id="pulseGen_2" delay="0ms" duration="10000ms" amplitude="0.033444 nA"/>
    <pulseGenerator id="pulseGen_3" delay="0ms" duration="10000ms" amplitude="0.010134 nA"/>
    <pulseGenerator id="pulseGen_4" delay="0ms" duration="10000ms" amplitude="0.080287 nA"/>
    <network id="IzNet">
        <population id="IzPop0" component="iz2007RS0" size="5"/>
        <population id="IzPop1" component="iz2007RS0" size="5"/>
        <projection id="proj" presynapticPopulation="IzPop0" postsynapticPopulation="IzPop1" synapse="syn0">
            <connection id="0" preCellId="../IzPop0[0]" postCellId="../IzPop1[0]"/>
            <connection id="1" preCellId="../IzPop0[0]" postCellId="../IzPop1[1]"/>
            <connection id="2" preCellId="../IzPop0[0]" postCellId="../IzPop1[3]"/>
            <connection id="3" preCellId="../IzPop0[1]" postCellId="../IzPop1[1]"/>
            <connection id="4" preCellId="../IzPop0[1]" postCellId="../IzPop1[3]"/>
            <connection id="5" preCellId="../IzPop0[2]" postCellId="../IzPop1[2]"/>
            <connection id="6" preCellId="../IzPop0[2]" postCellId="../IzPop1[3]"/>
            <connection id="7" preCellId="../IzPop0[3]" postCellId="../IzPop1[2]"/>
            <connection id="8" preCellId="../IzPop0[4]" postCellId="../IzPop1[1]"/>
        </projection>
        <explicitInput target="IzPop0[0]" input="pulseGen_0"/>
        <explicitInput target="IzPop0[1]" input="pulseGen_1"/>
        <explicitInput target="IzPop0[2]" input="pulseGen_2"/>
        <explicitInput target="IzPop0[3]" input="pulseGen_3"/>
        <explicitInput target="IzPop0[4]" input="pulseGen_4"/>
    </network>
</neuroml>
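If that diagnosis is right, a minimal, hypothetical sketch of the kind of guard that would avoid the crash in the loader (this is not the library's actual code, and the helper name is made up):

from neuroml import NeuroMLDocument

def iter_includes(nml2_doc):
    """Yield include elements only if the parsed root really supports them."""
    # The traceback shows nml2_doc can end up being a Property object when a
    # LEMS file is fed to the NeuroML reader, so check before iterating.
    if isinstance(nml2_doc, NeuroMLDocument) and hasattr(nml2_doc, "includes"):
        for include in nml2_doc.includes:
            yield include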

Simplification of "add" methods

Most of the classes have a long list of "add_this()", "add_that()" methods. Since Python makes introspection very easy, couldn't we replace all of these with a single "add()" method per class, which would do the right thing based on the object type?

This would greatly simplify the API and make it easier for users.

For example:

doc = NeuroMLDocument(id="my_doc")
net = Network(id="my_net")
doc.add(net)   # instead of doc.add_network(net)
pop = Population(...)
net.add(pop)   # instead of net.add_population(pop)
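A rough, hypothetical sketch of how such a generic add() could dispatch on the object's type; member_map_ is an assumed per-class mapping from element types to container attribute names, not an existing attribute:

class AddMixin:
    member_map_ = {}  # e.g. {Network: "networks", Population: "populations"}

    def add(self, obj):
        """Append obj to the container list that matches its type."""
        for cls, attr in self.member_map_.items():
            if isinstance(obj, cls):
                getattr(self, attr).append(obj)
                return obj
        raise TypeError("%s cannot contain a %s"
                        % (type(self).__name__, type(obj).__name__))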

generateds_config.py doesn't run process_includes on the schema in config.py

I noticed that generateds_config.py loads the schema specified in config.py without first processing it with process_includes.py. This isn't a problem with the current NeuroML schema, which doesn't contain any includes, but it could cause problems if the parser is regenerated from a schema file that does contain an include (e.g. a new schema with custom elements that extends the current NeuroML schema). I therefore suggest that generateds_config.py be modified to run the file listed in config.py through the process_includes.process_include_files() function before passing it to lxml.objectify.parse(). If you accept this suggestion, I can submit a pull request that contains it.
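For illustration, a sketch of the suggested change, assuming the two-argument form process_include_files(infile, outfile) and placeholder file names:

from lxml import objectify
import process_includes  # utility shipped with generateDS

infile = "NeuroML_v2beta.xsd"        # the schema named in config.py (placeholder)
outfile = "NeuroML_v2beta_flat.xsd"  # temporary schema with includes expanded

# expand xs:include directives first, then parse the flattened schema
process_includes.process_include_files(infile, outfile)
tree = objectify.parse(outfile)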

remove getters and setters

GenerateDS creates methods for each object with names such as set_pulseGenerator. These are unnecessary because there is no complex logic in setting these attributes. GenerateDS probably provides a way to leave these getters and setters out, and this should be done.

Trying to set an attribute which isn't present should throw an error

In the following, only the second line sets an attribute correctly in libNeuroML:

conn0 = Connection(id="0")
conn0.pre_cell_id="../%s/0/%s"%(conn.pre_cell, conn.pre_cell)
conn0.postCellId="../%s/0/%s"%(conn.post_cell, conn.post_cell)

However, the third line throws no error, even though the attribute name is wrong.

In general an error should be thrown if an unknown attribute is set. Probably a change is needed in the core of generateDS...
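As an illustration only (the real change would likely need to go into the generateDS templates), a sketch of a __setattr__ guard; member_names_ here is an assumed per-class list of valid attribute names:

class StrictAttributes:
    member_names_ = []  # e.g. ["id", "pre_cell_id", "post_cell_id", ...]

    def __setattr__(self, name, value):
        # reject assignments to attribute names the class does not declare
        if name not in self.member_names_:
            raise AttributeError("'%s' is not a valid attribute of %s"
                                 % (name, type(self).__name__))
        super().__setattr__(name, value)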

Consistent pluralization in generateds_config.py

If attribute maxOccurs=1 for an element in the schema, NameTable in generateDS needs to be changed to reflect the fact that this should be pluralized:

  • names ending in 'x': append 'es'
  • names ending in 's': append 'es' (not a bare 's')
  • names ending in 'y': remove the 'y' and append 'ies'
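Taken together, those rules amount to something like this sketch (illustrative only; not the code currently in generateds_config.py):

def pluralize(name):
    """Plural form for a generated container attribute, per the rules above."""
    if name.endswith("y"):
        return name[:-1] + "ies"       # property -> properties
    if name.endswith(("s", "x")):
        return name + "es"             # bias -> biases, box -> boxes
    return name + "s"                  # network -> networks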

Examples missing from pypi tar (and other issues)

The 0.2.47 tar on pypi does not include the examples folder. I don't see any changes in the Manifest.in file, though.

The readme's changelog at the bottom does not mention the new releases either. Any chance that could be updated (and git tags used to mark these releases on GitHub too)? That would make it easier for us downstream to match the PyPI tarball to the GitHub source when needed---like today, when I was trying to see which commit makes the 0.2.47 release---and to see what's changed between releases.

Build improvements

  • Build against python 3.6, 3.7
  • Drop support for python 2.6. It's been EOL for half a decade. I think it's time.
  • Use pip instead of conda on travis (I know I'm to blame for using conda in the first place, but I've grown...). Conda's dependency tree resolution is so slow these days that it may have wiped out the speed gains anyway.
  • Use requirements.txt to set up a full development environment
  • Use extras_require to allow users to install all extra features (see the sketch after this list)
  • (minor) add __version_info__
  • Test against jsonpickle 0.9.6 and 1.0
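A hypothetical sketch of what an extras_require section could look like in setup.py; the group names and packages are illustrative, not the project's actual dependency list:

from setuptools import setup

setup(
    name="libNeuroML",
    # ... other metadata unchanged ...
    extras_require={
        "hdf5": ["tables"],
        "full": ["tables", "lxml", "jsonpickle", "numpy"],
    },
)

With something like this in place, pip install "libNeuroML[full]" would pull in every optional dependency in one go.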

No doc attribute for pulseGeneratorDL

The dimensionless pulse generator pulseGeneratorDL cannot be located in the usual way:

doc = loaders.NeuroMLLoader.load(NML_MODEL_PATH)
my_pgdls = doc.pulse_generator_dls

There is no such attribute in the loaded doc, only one for the regular pulse_generators. It appears that the loader does not yet match the schema in this regard. I figure these attributes are auto-generated from some list, and since nml.py is in turn generated by generateDS, it is probably best if I stay out of this one.
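A quick way to confirm which container attributes the loaded document actually exposes (the path is a placeholder and the snippet is only a diagnostic sketch):

from neuroml import loaders

NML_MODEL_PATH = "model.nml"  # placeholder path
doc = loaders.NeuroMLLoader.load(NML_MODEL_PATH)

# list the attributes of the loaded document that are container lists
containers = sorted(name for name in dir(doc)
                    if not name.startswith("_")
                    and isinstance(getattr(doc, name), list))
print(containers)  # 'pulse_generators' is listed, 'pulse_generator_dls' is not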

Tags to match PyPI releases

The latest release on PyPI is 0.2.53, but the latest one on GitHub is 0.2.50. Could the missing ones please be tagged here?

(Also the latest tags are not in the master branch---are they supposed to be public releases for users?)

Debug message and solution for cryptic "no attribute nsmap" error

If the lxml package is not installed, nml.py falls back on other XML parsers. There is a hard-coded reliance on the nsmap attribute in a few places, though, which I think you don't get without lxml. Rather than implement all the alternatives, I want to just catch these errors and encourage the user to install lxml in the debug message. All of this would go into nml.py, but I wasn't sure how much of it is auto-generated. Where can and can't I make lasting commits to nml.py?
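For illustration, a minimal sketch of that approach (the wrapper function name is made up; deciding where it could live in the auto-generated nml.py is exactly the open question):

def get_namespace(node, prefix):
    """Resolve a namespace prefix, hinting at lxml if nsmap is unavailable."""
    try:
        return node.nsmap.get(prefix)
    except AttributeError:
        # xml.etree Elements have no nsmap attribute; only lxml provides it
        raise AttributeError(
            "this XML node has no 'nsmap' attribute; the xml.etree fallback "
            "parser does not provide it -- installing lxml "
            "(pip install lxml) should fix this")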

Discrepancy in singular/plural for sequences

It seems the Python libNeuroML adds a plural suffix to subelements which are defined as sequences in the schema. This results in grammatically incorrect and confusing names like extracellular_propertieses.
I propose that the additional suffix be dropped in favour of the original tag in the NeuroML2 schema; users can look up the definition, or simply check the datatype in Python, to determine whether an attribute of the NeuroML document is a list or a single element.

issues encountered during generation of parser from modified schema

I recently tried to regenerate the libNeuroML parser (via generateDS 2.12e) in the latest revision on GitHub (3ac0279) using a modified copy of the NeuroML_v2beta2.xsd schema with an additional cell type added:

    <xs:complexType name="MyCell">
        <xs:complexContent>
            <xs:attribute name="gL" use="required"/>
            <xs:attribute name="gM" use="required"/>
        </xs:complexContent>
    </xs:complexType>

I noticed, however, that the Python variable associated with attribute 'gL' in the regenerated parser was named 'g_l' while that associated with attribute 'gM' was named 'gM'. While attempting to debug why camel-cased attributes were not consistently converted to lowercase+underscores, I observed that neuroml/nml/config.py contains

variables={'schema_name':'NeuroML_v2beta1.xsd'}

even though the contents of the copy of neuroml/nml/nml.py in the libNeuroML repo indicate that it was generated from NeuroML_v2beta2.xsd.

When I replaced the file listed in config.py with my modified copy of NeuroML_v2beta2.xsd, however, generateDS threw a RuntimeError because the maximum recursion depth was reached in the traverse_doc() function from generateds_config.py; raising the Python recursion limit to 1100 by adding

import sys
sys.setrecursionlimit(1100)

to generateds_config.py circumvented this problem. After this fix, the resulting parser consistently named the Python variables associated with attributes 'gL' and 'gM' as 'g_l' and 'g_m', respectively.

When I tried the above by modifying NeuroML_v2beta1.xsd, I didn't have to alter the recursion limit to get generateDS to process the file.

In light of the above, shouldn't config.py be updated to list NeuroML_v2beta2.xsd, generateds_config.py be updated to raise the recursion limit, and the copy of nml.py in the repo be regenerated? I can submit a pull request if so desired.

[Query] Can we pass a nml file's file object to load it

Hi all, we are using the libNeuroML library to load .nml files for our use case. Currently we pass a file path to the NeuroMLLoader.load function and it works fine. I want to know if it is possible to pass a file object directly, instead of a file path, to load the NeuroML.
Our reasons:

  1. It would allow us to convert a string into a file object using StringIO and pass that.
  2. We wouldn't have to deal with file handling and the issues related to it.

Would this be good functionality to have? I am willing to work on it.

@pgleeson
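A rough sketch of what the requested behaviour could look like (this is not the current NeuroMLLoader implementation; the class name is made up and the parsing step is left as a stub):

import io

class FileOrPathLoader:
    """Sketch of a loader that accepts either a path or a file-like object."""

    @classmethod
    def load(cls, src):
        if hasattr(src, "read"):      # file object, e.g. io.StringIO
            return cls._parse(src.read())
        with open(src) as f:          # otherwise treat src as a path
            return cls._parse(f.read())

    @classmethod
    def _parse(cls, xml_string):
        ...  # delegate to the generated nml.py parser

# usage with an in-memory document (returns None here since _parse is a stub):
doc = FileOrPathLoader.load(io.StringIO("<neuroml id='IzNet'>...</neuroml>"))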

Add generation of equivalent libNeuroML for v1.8.1

We could generate an API using the v1.8.1 schemas and the updated generateDS and allow:

from neuromlv1 import NeuroMLDocument
from neuromlv1 import Network

etc.

This would probably require one large schema to be created from the existing v1 schemas, and there may also be problems with namespaces, but it is probably worth a try...

Replace "fromxx" with "from_"

Where "from" is desired as a keyword argument, this is replaced by "fromxx", since "from" is a Python keyword.

I think "fromxx" is ugly and a little confusing. I suggest replacing it with "from_"

Bring install procedure for end users into alignment with travis.yml

The installation documentation claims that all that is required to install libNeuroML is:

Use the standard install method for Python packages:

sudo python setup.py install

However, the travis.yml file requires Travis CI to do a ton of other things:

before_install:
    - sudo apt-get update -qq
    - sudo apt-get install -qq libhdf5-serial-dev
    - if [[ $TRAVIS_PYTHON_VERSION == "2.6" ]]; then pip install unittest2; fi
    - pip install cython numpy

# command to install dependencies
install:
  - "pip install lxml"
  - "pip install numpy"
  - "pip install numexpr"
  - "pip install . --use-mirrors"
  - "pip install jsonpickle"
  - "pip install pymongo"
  - if [[ $TRAVIS_PYTHON_VERSION != "3.2" ]]; then pip install simplejson; fi
  - "pip install tables"
  - "pip install -r requirements.txt --use-mirrors"

This means most users aren't installing libNeuroML in a way that lets all the tests run as they do on Travis CI, so successful Travis CI builds give us a false sense of security. For example, HDF5 is a dependency of libNeuroML, but none of that comes in via the setup process.

Error on Windows but not on Ubuntu

I am getting a weird error when using the library on Windows (inside the Blender software), although when I parsed the same file with the same method on Ubuntu, this error didn't occur.

Traceback (most recent call last):
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\loaders.py", line 38, in nml2_doc
    nml2_doc = nmlparse(file_name)
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\nml\nml.py", line 22342, in parse
    rootObj.build(rootNode)
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\nml\nml.py", line 14592, in build
    self.buildAttributes(node, node.attrib, already_processed)
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\nml\nml.py", line 14598, in buildAttributes
    super(NeuroMLDocument, self).buildAttributes(node, attrs, already_processed)
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\nml\nml.py", line 5925, in buildAttributes
    value = find_attr_value('xsi:type', node)
  File "D:\Blender\2.79\python\lib\site-packages\neuroml\nml\nml.py", line 493, in find_attr_value
    namespace = node.nsmap.get(prefix)
AttributeError: 'xml.etree.ElementTree.Element' object has no attribute 'nsmap'

Do you have any suggestions on how it can be fixed? Thanks in advance

Some examples require temporary folder

Steve Marsh notes:

Whilst running Examples:

$ python arraymorph.py
Traceback (most recent call last):
  File "arraymorph.py", line 69, in <module>
    writers.NeuroMLWriter.write(doc,fn)
  File "/Library/Python/2.7/site-packages/neuroml/writers.py", line 16, in write
    file = open(file,'w')
IOError: [Errno 2] No such file or directory: '/home/mike/testmorphwrite.nml'

$ python run_all.py

Running all examples...

Running arraymorph_generation.py
Traceback (most recent call last):
  File "run_all.py", line 7, in <module>
    run_example("arraymorph_generation.py")
  File "run_all.py", line 5, in run_example
    exec "import %s"%ex_file[:-3]
  File "<string>", line 1, in <module>
  File "/Users/stevenjtmarsh/BIMPA/BIMPASoftwareGitrepo/NeuroML/libNeuroML/neuroml/examples/arraymorph_generation.py", line 71, in <module>
    writers.NeuroMLWriter.write(doc,fn)
  File "/Library/Python/2.7/site-packages/neuroml/writers.py", line 16, in write
    file = open(file,'w')
IOError: [Errno 2] No such file or directory: 'tmp/arraymorph.nml'

Manually creating ./tmp fixes this problem.

Hope this helps!

-- Steve Marsh
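A minimal fix along those lines, sketched as a standalone helper (not the current writers.py code; the function name is made up):

import os
from neuroml import writers

def write_with_dirs(doc, path):
    """Write a NeuroML document, creating the parent directory if missing."""
    parent = os.path.dirname(path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    writers.NeuroMLWriter.write(doc, path)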

0.2.54: More Python 3 warnings

While updating to the newest release on Fedora 33, these warnings came up:

+ nosetests-3
......................../builddir/build/BUILD/libNeuroML-0.2.54/neuroml/arraymorph.py:45: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  self.vertices = np.array(vertices)
........../usr/lib64/python3.9/unittest/case.py:550: ResourceWarning: unclosed file <_io.TextIOWrapper name='/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/examples/test_files/pyr_4_sym.cell.nml' mode='r' encoding='UTF-8'>
  method()
ResourceWarning: Enable tracemalloc to get the object allocation traceback
../usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/usr/lib64/python3.9/site-packages/tables/attributeset.py:464: NaturalNameWarning: object name is not a valid Python identifier: 'column_-1'; it does not match the pattern ``^[a-zA-Z_][a-zA-Z0-9_]*$``; you will not be able to use natural naming to access this object; using ``getattr()`` will still work, though
  check_attribute_name(name)
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/usr/lib64/python3.9/site-packages/tables/attributeset.py:464: NaturalNameWarning: object name is not a valid Python identifier: 'column_-1'; it does not match the pattern ``^[a-zA-Z_][a-zA-Z0-9_]*$``; you will not be able to use natural naming to access this object; using ``getattr()`` will still work, though
  check_attribute_name(name)
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
./usr/lib64/python3.9/site-packages/tables/attributeset.py:464: NaturalNameWarning: object name is not a valid Python identifier: 'property:color'; it does not match the pattern ``^[a-zA-Z_][a-zA-Z0-9_]*$``; you will not be able to use natural naming to access this object; using ``getattr()`` will still work, though
  check_attribute_name(name)
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
./builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/usr/lib64/python3.9/site-packages/tables/attributeset.py:464: NaturalNameWarning: object name is not a valid Python identifier: 'property:color'; it does not match the pattern ``^[a-zA-Z_][a-zA-Z0-9_]*$``; you will not be able to use natural naming to access this object; using ``getattr()`` will still work, though
  check_attribute_name(name)
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/attributeset.py:464: NaturalNameWarning: object name is not a valid Python identifier: 'property:color'; it does not match the pattern ``^[a-zA-Z_][a-zA-Z0-9_]*$``; you will not be able to use natural naming to access this object; using ``getattr()`` will still work, though
  check_attribute_name(name)
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLHdf5Parser.py:503: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
....................../usr/lib64/python3.9/site-packages/tables/array.py:241: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  (oid, self.atom, self.shape, self._v_chunkshape) = self._open_array()
..<__array_function__ internals>:5: DeprecationWarning: Calling nonzero on 0d arrays is deprecated, as it behaves surprisingly. Use `atleast_1d(cond).nonzero()` if the old behavior was intended. If the context of this warning is of the form `arr[nonzero(cond)]`, just use `arr[cond]`.
/usr/lib/python3.9/site-packages/jsonpickle/ext/numpy.py:139: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  if obj.dtype == np.object:
/usr/lib/python3.9/site-packages/jsonpickle/ext/numpy.py:191: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  if dtype == np.object:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/test/test_writers.py:135: ResourceWarning: unclosed file <_io.TextIOWrapper name='/tmp/tmpaaqob6y6' mode='r' encoding='UTF-8'>
  doc = loader_method(filename)
ResourceWarning: Enable tracemalloc to get the object allocation traceback
..SS/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/test/test_writers.py:154: ResourceWarning: unclosed file <_io.TextIOWrapper name='/tmp/tmpalxs_aos' mode='r' encoding='UTF-8'>
  document = loader_method(filename)
ResourceWarning: Enable tracemalloc to get the object allocation traceback
./usr/lib64/python3.9/traceback.py:220: ResourceWarning: unclosed file <_io.TextIOWrapper name='tmpfile' mode='w' encoding='UTF-8'>
  tb.tb_frame.clear()
ResourceWarning: Enable tracemalloc to get the object allocation traceback
./builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:121: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
/builddir/build/BUILD/libNeuroML-0.2.54/neuroml/hdf5/NeuroMLXMLParser.py:98: DeprecationWarning: inspect.getargspec() is deprecated since Python 3.0, use inspect.signature() or inspect.getfullargspec()
  if 'properties' in inspect.getargspec(self.netHandler.handle_population)[0]:
.

Most of these look like they need minor tweaks. If you assign this to me @pgleeson, I can fix them up and open a PR. Cheers,
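For instance, the repeated inspect.getargspec() warning could be addressed with something like this sketch (the helper name is made up; the actual parser code is not quoted here):

import inspect

def handler_accepts_properties(net_handler):
    """True if net_handler.handle_population declares a 'properties' argument."""
    # getfullargspec replaces the deprecated getargspec on Python 3
    argspec = inspect.getfullargspec(net_handler.handle_population)
    return "properties" in argspec.args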

GateHHRates generates invalid NeuroML

I'm afraid I'm not sure if this is a bug in libNeuroML or jNeuroML, but I suspect the former. It appears to be something that broke recently.

My code to build NeuroML models originally created gates in the following way:

        gate_ml = neuroml.GateHHRates(id="gate_{0}_{1}".format(channel.name, gate.name),
                                      type="gateHHrates",
                                      instances = gate.power,
                                      forward_rate = fwd_rate,
                                      reverse_rate = rev_rate)

However, I've recently switched to a new PC and in the process have updated to the latest (development) branch of libNeuroML (neuroml.__version__ is 0.2.4). With this, I get the following exception when trying to construct a GateHHRates:

Traceback (most recent call last):
  File "neuroml_convert.py", line 331, in <module>
    "test")
  File "neuroml_convert.py", line 191, in convert_simulation
    cell_type_ml,cell_type_ml_path = load_convert_save_cell_type(type_path)
  File "neuroml_convert.py", line 155, in load_convert_save_cell_type
    cell_type_ml, channels_ml = convert_cell_type(cell_type)
  File "neuroml_convert.py", line 117, in convert_cell_type
    channel_ml, density_ml = convert_channel(c, prefix=cell.id+"_")
  File "neuroml_convert.py", line 55, in convert_channel
    reverse_rate = rev_rate)
TypeError: __init__() got an unexpected keyword argument 'type'

So I removed the type="gateHHrates" argument (it seemed odd that it was needed in the first place), which works and generates the following NeuroML:

        <gate id="gate_kSlow_ks" instances="1">
            <forwardRate midpoint="2.96mV" rate="0.2per_ms" scale="7.74mV" type="HHSigmoidRate"/>
            <reverseRate midpoint="14.07mV" rate="0.05per_ms" scale="-6.1mV" type="HHSigmoidRate"/>
        </gate>

However, when I now try to convert the generated NeuroML into a NEURON model using jNeuroML, I get an error that "gate" is an unknown element type. Manually changing the element type in the NeuroML file to "gateHHrates" fixes the problem. My jNeuroML versions:

 jNeuroML v0.7.0
    org.neuroml.import  v1.4.1
    org.neuroml.export  v1.4.1
    org.neuroml.model   v1.4.1
    jLEMS               v0.9.7.3

Again, I'm not sure if the problem is that libNeuroML generates invalid NeuroML, or that jNeuroML thinks valid NeuroML is invalid. Apologies if this is posted to the wrong repository!
