
SpikeTorch

Python package used for simulating spiking neural networks (SNNs) in PyTorch.

At the moment, the focus is on replicating the SNN described in Unsupervised learning of digit recognition using spike-timing-dependent plasticity (original code found here, extensions thereof found in my previous project repository here).

We are currently interested in applying SNNs to simple machine learning (ML) tasks, but the code can be used for any purpose.

Requirements

All code was developed using Python 3.6.x and will fail if run with Python 2.x. Use pip install -r requirements.txt to install all project dependencies. You may have to consult the PyTorch webpage to get the right installation for your machine.

Setting things up

To begin, download and unzip the MNIST dataset by running ./data/get_MNIST.sh. To build the spiketorch package from source, change directory to the top level of this project and issue pip install . (PyPI support hopefully coming soon). After making changes to code in the spiketorch directory, reinstall by issuing pip install . -U or pip install . --upgrade at the top level of the project.

To replicate the SNN from the above paper, run python examples/eth.py. There are a number of optional command-line arguments, including --plot (displays useful monitoring figures), --n_neurons [int] (number of excitatory and inhibitory neurons simulated), --mode ['train' | 'test'] (sets network operation to the training or testing phase), and more. Run python examples/eth.py --help for more information on the command-line arguments.

Note: This is a work in progress; the replication script examples/eth.py and the other scripts in examples/ are still being developed.

Background

One computational challenge is simulating time-dependent neuronal dynamics. This is typically done by solving the ordinary differential equations (ODEs) which describe those dynamics. PyTorch does not explicitly support the solution of differential equations (as opposed to brian2, for example), but we can convert the ODEs defining the dynamics into difference equations and solve them at regular, short intervals (a dt on the order of 1 millisecond) as an approximation; see the sketch after the list below. Of course, under the hood, packages like brian2 are doing the same thing. Doing this in PyTorch is exciting for a few reasons:

  1. We can use the powerful and flexible torch.Tensor object, which offers a numpy.ndarray-like interface and can be transferred to and from GPU devices.

  2. We can avoid "reinventing the wheel" by repurposing functions from the torch.nn.functional PyTorch submodule in our SNN architectures; e.g., convolution or pooling functions.
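
To make the first point concrete, here is a minimal sketch of one Euler step of leaky integrate-and-fire (LIF) dynamics written directly against torch.Tensor operations. All names and constants (v_rest, tau, the threshold value, and so on) are illustrative assumptions, not SpikeTorch's actual API:

    import torch

    # Minimal sketch: one Euler step of LIF dynamics,
    # dv/dt = -(v - v_rest) / tau + I.
    # Names and constants are illustrative, not SpikeTorch's actual API.
    n_neurons = 100
    dt = 1.0                         # time step (ms)
    tau = 10.0                       # membrane time constant (ms)
    v_rest, v_thresh = -65.0, -52.0  # rest and threshold potentials (mV)

    v = torch.full((n_neurons,), v_rest)   # membrane potentials
    I = torch.rand(n_neurons)              # input currents (placeholder)

    # Difference-equation approximation of the ODE: v <- v + dt * dv/dt.
    v += dt * (-(v - v_rest) / tau + I)

    # Emit spikes where the threshold is crossed, then reset those neurons.
    s = v >= v_thresh
    v[s] = v_rest

Because every operation is an elementwise tensor op, the same step runs unchanged on a GPU once the tensors are moved there (e.g., via .cuda()).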

The idea that the ordering and relative timing of neurons' spikes encode information is a central theme in neuroscience. Markram et al. (1997) proposed that synapses between neurons should strengthen or weaken based on this relative timing; prior to that, Donald Hebb proposed the theory of Hebbian learning, often summarized as "Neurons that fire together wire together." Markram et al.'s extension of Hebbian theory is known as spike-timing-dependent plasticity (STDP).

We are interested in applying SNNs to machine learning problems. We use STDP to modify weights of synapses connecting pairs or populations of neurons in SNNs. In the context of ML, we want to learn a setting of synapse weights which will generate appropriate data-dependent spiking activity in SNNs. This activity will allow us to subsequently perform some ML task of interest; e.g., discriminating or clustering input data.
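
To make this concrete, here is a minimal sketch of a pair-based STDP update using exponentially decaying pre- and postsynaptic traces. The function name, learning rates, and trace conventions are illustrative assumptions, not the package's actual code:

    import torch

    # Minimal sketch of a pair-based STDP update with exponential traces.
    # Names, learning rates, and conventions are illustrative assumptions.
    def stdp_step(w, x_pre, x_post, s_pre, s_post,
                  dt=1.0, tau=20.0, nu_pre=1e-4, nu_post=1e-2):
        # Decay the traces, then set them to 1 for neurons that just spiked.
        x_pre = x_pre * (1.0 - dt / tau)
        x_pre[s_pre] = 1.0
        x_post = x_post * (1.0 - dt / tau)
        x_post[s_post] = 1.0

        # torch.ger is the outer product (torch.outer in newer PyTorch).
        # Pre-before-post: potentiate synapses whose presynaptic trace is
        # still high when the postsynaptic neuron fires.
        w = w + nu_post * torch.ger(x_pre, s_post.float())
        # Post-before-pre: depress synapses whose postsynaptic trace is
        # still high when the presynaptic neuron fires.
        w = w - nu_pre * torch.ger(s_pre.float(), x_post)
        return w, x_pre, x_post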

For now, we use the MNIST handwritten digit dataset, which, though somewhat antiquated, is simple enough to develop new machine learning techniques on. The goal is to find a setting of synapse weights which will allow us to discriminate categories of input data. Based on historical spiking activity on training examples, we assign each neuron in an excitatory population an input category and subsequently classify test data based on these assignments.
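
The assignment-and-voting scheme can be sketched as follows, assuming per-example spike counts have already been recorded during training; all names and shapes here are illustrative:

    import torch

    # Hypothetical sketch: label each excitatory neuron by the class for
    # which it spiked most during training, then classify a test example by
    # the class whose assigned neurons spiked most.
    def assign_labels(spike_counts, labels, n_classes=10):
        # spike_counts: (n_examples, n_neurons); labels: (n_examples,)
        rates = torch.zeros(n_classes, spike_counts.size(1))
        for c in range(n_classes):
            mask = labels == c
            if mask.any():
                rates[c] = spike_counts[mask].float().mean(0)
        return rates.argmax(0)  # (n_neurons,): neuron -> assigned class

    def classify(spikes, assignments, n_classes=10):
        # spikes: (n_neurons,) spike counts for a single test example
        votes = torch.zeros(n_classes)
        for c in range(n_classes):
            mask = assignments == c
            if mask.any():
                votes[c] = spikes[mask].float().mean()
        return votes.argmax().item()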

Contributors

hananel-hazan

spiketorch's Issues

Type Checking

Certain constructors and functions should implement type checking, to avoid difficult-to-understand errors that would otherwise be thrown. This seems like a good implementation. For example, we might use function decorators to implement this as follows:

    @accepts(NeuronGroup, dict, str, float)
    def step(self, inpts, mode, dt):
        ...
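
For illustration, a minimal sketch of what such an accepts decorator might look like (hypothetical; the implementation referenced above may differ):

    import functools

    # Hypothetical sketch of an @accepts decorator; not the repository's
    # code. types[0] is the expected class of self, and the remaining types
    # are matched against the positional arguments in order.
    def accepts(*types):
        def decorator(f):
            @functools.wraps(f)
            def wrapper(*args, **kwargs):
                for arg, t in zip(args, types):
                    if not isinstance(arg, t):
                        raise TypeError(
                            f"{f.__name__}: expected {t.__name__}, "
                            f"got {type(arg).__name__}")
                return f(*args, **kwargs)
            return wrapper
        return decorator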

Generic NeuronGroup

As in BRIAN, I feel that it would be beneficial to discard objects such as LIFGroup and AdaptiveLIFGroup and replace them with a single, generic NeuronGroup. One would pass in n_neurons (as usual), but also arguments giving the equation(s) of the neuronal dynamics (e.g., 'dv/dt = -v') and other pertinent information (e.g., threshold and reset behavior).

The main hurdle is parsing equations into actionable torch.Tensor operations. [sympy](http://www.sympy.org/en/index.html) has support for converting symbolic mathematics into theano functions. I'm not sure if we can do something similar, but it would certainly be nice to have. If it were possible, this would be a large and time-consuming addition.
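
As a rough proof of concept, here is a minimal sketch that parses a dynamics string with sympy and applies the result to torch.Tensors via an Euler step. It relies on the fact that, for elementary expressions, sympy.lambdify generates plain Python operators, which torch.Tensor overloads; a real implementation would need much more machinery (multiple state variables, threshold and reset clauses, and so on):

    import sympy as sp
    import torch

    # Minimal sketch: parse the right-hand side of a dynamics equation from
    # a string and evaluate it on torch.Tensors via Euler steps. Note that
    # sympy reserves the name I for the imaginary unit, so the input current
    # is called i_syn here.
    v_sym, i_sym = sp.symbols('v i_syn')
    rhs = sp.sympify('-v + i_syn')          # user-supplied RHS of dv/dt
    f = sp.lambdify((v_sym, i_sym), rhs)    # compiles to Python operators

    dt = 1.0
    v = torch.zeros(100)
    i_syn = torch.rand(100)
    v = v + dt * f(v, i_syn)                # one Euler step of dv/dt = -v + i_syn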

Questions in groups.py

I have a question about a line of code in groups.py: specifically, line 96, in the step function of the LIFGroup class. I was confused as to why it reads self.refrac_count[self.s] = dt * self.refractory and not self.refrac_count[self.s] = self.refractory. Why is there a dt?

So, we are checking which neurons spiked, and the ones that spiked now have to wait for the refractory period to end. I can see that on line 89 you decrement the counter by dt per time step, which makes sense; so why isn't the assignment simply the one I suggested above?
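
For context, a minimal reconstruction of the countdown logic in question (not the repository's actual code): if self.refractory is measured in time steps, multiplying by dt puts the counter in the same units (milliseconds) as the per-step decrement, so it expires after exactly refractory steps.

    import torch

    # Hypothetical reconstruction of the refractory countdown under
    # discussion; assumes `refractory` counts time steps, so dt * refractory
    # is in milliseconds.
    dt = 1.0                      # integration step (ms)
    refractory = 5                # refractory period (time steps, assumed)
    refrac_count = torch.zeros(10)
    s = torch.rand(10) > 0.5      # neurons that spiked this step (illustrative)

    refrac_count[s] = dt * refractory   # counter starts at 5.0 ms
    refrac_count -= dt                  # decremented by dt each step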

Another question is about the line self.x[self.s.byte()] = 1.0 on line 104. Looking at the documentation for byte(), it says it casts a tensor to byte type. Why are we casting to byte? This style of indexing usually takes a boolean array, so I thought it should be self.x[self.s], since self.s is an array of booleans.

Object detection with SNN

Hi,
As we know, most current work on SNNs focuses on classification problems. Do you think it is possible to solve object detection with SNNs, given that the detection stage requires solving a regression problem? Thank you very much.

Refactor Network step() function

Currently, the step() function in the Network class accepts the arguments mode, inpts, and time. If we are to separate the training / testing of networks from the object definition, we should remove the mode parameter (which selects the train or test phase).

Here's what I'm thinking:

  1. Remove the argument mode.
  2. Always update Synapses which have STDP "enabled".
  3. "Enable" STDP on learnable Synapses during training; disable during test.

The enabling of STDP could be a boolean class-level attribute on a generic Synapses object; a sketch of this design follows.
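
A minimal sketch of that design, using hypothetical names (Synapses, stdp_enabled, Network.train) rather than the repository's actual API:

    import torch

    # Hypothetical sketch: STDP is toggled on the synapse object itself, so
    # that Network.step() no longer needs a `mode` argument.
    class Synapses:
        def __init__(self, n_source, n_target, learnable=True):
            self.w = torch.rand(n_source, n_target)
            self.learnable = learnable
            self.stdp_enabled = False

    class Network:
        def __init__(self, synapses):
            self.synapses = synapses

        def train(self, enabled=True):
            # Enable STDP on learnable synapses for training; disable for test.
            for s in self.synapses:
                s.stdp_enabled = s.learnable and enabled

        def step(self, inpts, time):
            for s in self.synapses:
                if s.stdp_enabled:
                    ...  # apply the STDP update; no `mode` argument needed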
