
Binocular Balance/Deprivation through Hebbian Learning and BCM Theory

Generated by DALL·E

This is the final project for the course "Introduction to Neurophysics" at National Tsing Hua University. The goal of this project is to study binocular balance.

Introduction

Neural networks, the intricate webs of neurons in our brains, are not just conduits for electrical impulses but the very foundation of learning and perception. Among the various mechanisms governing their adaptability, Hebbian learning stands out as a pivotal concept.

In the following sections, we briefly introduce the basic concepts of Hebbian learning, BCM theory, and binocular balance.

1. Hebbian Learning

Hebbian learning, a fundamental concept in neuroscience, is named after Donald Hebb, who proposed it in his 1949 book "The Organization of Behavior." It is often summarized by the phrase "neurons that fire together, wire together." This principle suggests that synaptic connections between neurons are strengthened when the neurons are activated simultaneously. Hebbian learning is crucial for understanding how experiences and behaviors lead to changes in the brain's neural networks. It is a form of synaptic plasticity and plays a key role in learning and memory. This concept has been instrumental in the development of theories about neural network function and is a foundational element in various fields, including computational neuroscience and psychology.

Note

The central concept in Hebbian learning is synaptic plasticity, the ability of synapses to strengthen or weaken over time. Consider the $i$-th presynaptic neuron and the $j$-th postsynaptic neuron. The general dynamics of the synaptic weight is described by the following equation:

$$ \tau_w \frac{dw_{ji}}{dt} = c_0 + c_1^{pre}(w_{ji}) x_i + c_1^{post}(w_{ji}) y_j + c_2^{pre}(w_{ji}) x_i^2 + c_2^{post}(w_{ji}) y_j^2 + c_{11}^{corr}(w_{ji}) y_jx_i + \mathcal{O}(3) $$

The Hebb rule takes the simplest approach, fixing $c_{11}^{corr}$ to a constant and neglecting all other terms. The discrete version of the Hebb rule is:

$$ w_{ji}(t+1) = w_{ji}(t) + \gamma y_j(t) x_{i}(t) $$

Or, in continuous limit:

$$ \tau_w \frac{dw_{ji}}{dt} = y_j x_i $$

where $\tau_w$ is the time constant of the synaptic weight, $y_j$ is the postsynaptic activity, and $x_i$ is the presynaptic activity.
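As a minimal sketch (with hypothetical activity values, not data from the project), the discrete Hebb rule above can be written as a one-line update:

```python
import numpy as np

def hebb_step(w, x, y, gamma=0.01):
    """One discrete Hebbian update: w(t+1) = w(t) + gamma * y * x."""
    return w + gamma * y * x

# Hypothetical example: two presynaptic inputs driving one postsynaptic neuron.
w = np.array([0.5, 0.5])
x = np.array([1.0, 0.0])  # only the first presynaptic neuron is active
y = 1.0                   # postsynaptic activity
w = hebb_step(w, x, y)
# only the co-active synapse (index 0) is strengthened
```

Note that only the synapse whose presynaptic neuron is active gets strengthened, which is exactly the "fire together, wire together" principle.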

Tip

On the other hand, we can write down the dynamics of the neuronal activity as follows:

$$ \tau \frac{dy_j}{dt} = -y_j + G\left(\sum_i w_{ji} x_i\right) $$

where $\tau$ is the time constant of the neuronal activity, and $G$ is the gain function.

From experiments, we know that the dynamics of the neuronal activity is much faster than the dynamics of the synaptic weight, i.e., $\tau \ll \tau_w$. This leads to the steady-state approximation:

$$ y_j = G\left(\sum_i w_{ji} x_i\right) $$

Substituting $y_j$ into the synaptic-weight equation, we obtain:

$$ \tau_w \frac{dw_{ji}}{dt} = G\left(\sum_k w_{jk} x_k\right) x_i $$

We can see that if $x_i$ dominates the sum and the total input exceeds the threshold of the gain function $G$, the synaptic weight increases.
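The last two equations can be sketched in a few lines of Python. The sigmoidal form of $G$ below is an assumption for illustration; the argument only requires $G$ to have a threshold:

```python
import numpy as np

def gain(u, threshold=0.5, slope=10.0):
    """A sigmoidal gain function G with a soft threshold (assumed form)."""
    return 1.0 / (1.0 + np.exp(-slope * (u - threshold)))

def hebb_steady_state_step(w, x, eta=0.01):
    """Hebbian update using the steady-state activity y = G(sum_k w_k x_k)."""
    y = gain(w @ x)
    return w + eta * y * x

# Hypothetical example: the first input dominates the weighted sum.
w = np.array([0.6, 0.1])
x = np.array([1.0, 0.2])
w_new = hebb_steady_state_step(w, x)
# both weights grow, and they keep growing on repeated presentations
```

Iterating this update makes the divergence problem discussed next easy to observe: with $y > 0$ and $x \geq 0$, every step increases $w$.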

Important

However, the synaptic weight diverges if there is no inhibition. Consequently, we introduce Oja's rule to renormalize the synaptic weight. The Oja-modified Hebbian learning rule is:

$$ \tau_w \frac{dw_{ji}}{dt} = y_j x_i - w_{ji}y_j^2 $$

And the discrete version is:

$$ w_{ji}(t+1) = w_{ji}(t) + \eta \left[G\left(\sum_k w_{jk}(t) x_k\right)x_i-w_{ji}(t)G\left(\sum_k w_{jk}(t) x_k\right)^2\right] $$

where $\eta$ is the learning rate.
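To see Oja's normalization at work, here is a small sketch with a linear gain $G(u) = u$ (a simplifying assumption) and zero-mean random inputs; the weight vector, deliberately started with norm greater than one, is pulled back toward unit length:

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """One Oja-modified Hebbian update with linear gain G(u) = u."""
    y = w @ x
    return w + eta * (y * x - y**2 * w)

rng = np.random.default_rng(0)
w = np.array([2.0, 1.0])       # start with |w| > 1 on purpose
for _ in range(5000):
    x = rng.normal(size=2)     # zero-mean random presynaptic input
    w = oja_step(w, x)
# Oja's decay term -y^2 w renormalizes the weight vector toward unit norm
print(np.linalg.norm(w))
```

This illustrates why Oja's term prevents the divergence of the plain Hebb rule: the decay is proportional to $y^2 w$, so it grows exactly as fast as the Hebbian drive.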

2. BCM Theory

BCM theory, formulated by Bienenstock, Cooper, and Munro in the early 1980s, extends the principles of Hebbian learning by introducing a dynamic threshold for synaptic plasticity. It proposes that the strength of synaptic connections is determined not only by simultaneous neuron activations but also by the history of neuronal activity. The model's innovative feature is its variable threshold, which adapts based on the neuron's previous firing patterns, allowing a more nuanced account of learning and memory. This dynamic threshold mechanism explains both synaptic strengthening (long-term potentiation) and weakening (long-term depression), offering significant insights into neural adaptability and function.

Note

The BCM theory can be described by the following equation:

$$ \tau_w \frac{dw_{ji}}{dt} = y_j\left(y_j - \theta_j\right)x_i $$

where $\theta_j$ is the dynamical threshold of the BCM theory, defined as a running (exponentially weighted) average of the squared postsynaptic activity:

$$ \theta_j(t) = \langle y_j^2(t) \rangle = \frac{1}{\tau} \int_0^t y_j^2(t')e^{-(t-t')/\tau}dt' $$
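In discrete time, the exponentially weighted average above becomes a simple first-order update (the step size `dt/tau` is a hypothetical choice for illustration):

```python
def update_threshold(theta, y, tau=100.0, dt=1.0):
    """Discrete BCM sliding threshold: theta <- theta + (dt/tau)*(y**2 - theta),
    a first-order discretization of the exponential average of y^2."""
    return theta + (dt / tau) * (y**2 - theta)

# If the postsynaptic activity is held constant, theta relaxes toward y^2.
theta = 0.0
for _ in range(2000):
    theta = update_threshold(theta, y=0.5)
print(theta)  # theta has relaxed to y**2 = 0.25
```

This is the mechanism that makes the threshold "slide": sustained high activity raises $\theta_j$, making further potentiation harder, while low activity lowers it.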

Important

Combining the Oja-modified Hebbian learning and BCM theory, we obtain:

$$ \tau_w \frac{dw_{ji}}{dt} = y_j\left(y_j - \theta_j\right)x_i - w_{ji}y_j^2 $$

And the discrete version is:

$$ w_{ji}(t+1) = w_{ji}(t) + \eta \left[y_j\left(y_j - \theta_j\right)x_i - w_{ji}y_j^2\right] $$

and

$$ y_j(t) = G\left(\sum_i w_{ji}(t) x_i\right) $$

These are the iteration equations we will use in the following simulations.

Note that in this case the synaptic weight will NOT converge, since Oja's term is of the same order as the BCM term.
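One iteration of the combined rule can be sketched as follows. The rectified-linear gain is an assumed form (the text leaves $G$ unspecified), and the numerical values are hypothetical:

```python
import numpy as np

def gain(u):
    """Assumed gain function: rectified linear."""
    return max(u, 0.0)

def bcm_oja_step(w, theta, x, eta=0.01, tau_theta=100.0):
    """One iteration of the combined BCM + Oja rule:
    y = G(w . x); w += eta * (y*(y - theta)*x - y^2 * w);
    theta tracks the running average of y^2."""
    y = gain(w @ x)
    w_new = w + eta * (y * (y - theta) * x - y**2 * w)
    theta_new = theta + (y**2 - theta) / tau_theta
    return w_new, theta_new, y

w, theta = np.array([0.3, 0.3]), 0.1
x = np.array([1.0, 0.5])
w, theta, y = bcm_oja_step(w, theta, x)
```

Because $y > \theta$ here, the BCM factor $y(y - \theta)$ is positive and both weights are potentiated, while the Oja term $-y^2 w$ partially counteracts the growth.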

3. Binocular Balance

Binocular balance, a critical aspect of our visual system, ensures a unified and coherent visual experience. It integrates distinct images from each eye, harmonizing them for depth perception and spatial awareness. This process is vital for constructing a stable, accurate representation of our three-dimensional world, illustrating the complexity of neural processing in visual perception and the balance between sensory input and neural activity.

The Hebbian learning rule, if applied simplistically to a multiple-input system like the visual cortex, might lead to a dominance of stronger inputs over weaker ones. However, in reality, sensory inputs are not always of equal strength, and the brain must adapt to this imbalance. In extreme cases, such as with a sensory impairment, this can disrupt neural balance.

Note

In clinical practice, the treatment for amblyopia (lazy eye) often involves covering the normal eye to enhance neural connections in the amblyopic eye. This approach is effective in children, where neural connections are still adaptable, but less so post-adolescence when these connections become more fixed.

Methodology

1. Binocular Balance

We consider a bi-sensory system model in which a negative bias is introduced to one sensory input while the other remains unbiased. The system undergoes three distinct phases:

  1. Pre-treatment, where both inputs receive equal random arrays.
  2. Treatment, where the input strength to the normal sensory input is deliberately reduced.
  3. Post-treatment, where the normal input strength is restored and the learning rate is significantly reduced to simulate the loss of neural plasticity with age.
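A rough sketch of these three phases, using the combined BCM + Oja iteration from above (the gain form, input scales, learning rates, and step counts here are all hypothetical choices, not the values used in the notebook):

```python
import numpy as np

rng = np.random.default_rng(0)

def run_phase(w, theta, steps, scale, eta, tau_theta=100.0):
    """Evolve a two-input BCM + Oja system for one phase.
    scale = (normal-input gain, amblyopic-input gain)."""
    for _ in range(steps):
        x = np.asarray(scale) * rng.random(2)
        y = max(w @ x, 0.0)                             # rectified-linear gain
        w = w + eta * (y * (y - theta) * x - y**2 * w)  # BCM + Oja update
        theta = theta + (y**2 - theta) / tau_theta      # sliding threshold
    return w, theta

w, theta = np.array([0.6, 0.4]), 0.1                        # normal eye starts stronger
w, theta = run_phase(w, theta, 500, (1.0, 0.7), eta=0.02)   # pre-treatment
w, theta = run_phase(w, theta, 500, (0.3, 0.7), eta=0.02)   # treatment: weaken normal input
w, theta = run_phase(w, theta, 500, (1.0, 0.7), eta=0.002)  # post-treatment: low plasticity
```

The binocular-deprivation simulation in bd.ipynb follows the same pattern, just with a different schedule of input scales across its five phases.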

The detailed methods are in bb.ipynb.

2. Binocular Deprivation

We construct another bi-sensory system model, this one focusing on binocular deprivation; both sensory inputs are initially unbiased. The model involves five phases:

  1. Normal Rearing, where both inputs receive identical arrays.
  2. Monocular Deprivation, reducing the input strength to one sensory system.
  3. Binocular Deprivation, reducing input strength to both sensory systems.
  4. Reverse Suture, restoring the initially reduced input to one sensory system.
  5. Binocular Recovery, where both systems receive full-strength, unbiased inputs.

The detailed methods are in bd.ipynb.

Results

1. Binocular Balance

Weight value before/after the treatment

The figure above illustrates the (average) synaptic weight changes before, during, and after treatment. Initially, the synaptic weight of the normal sensory input is stronger than that of the amblyopic sensory input. During the treatment phase, we observe a rapid reduction in the synaptic weight of the normal sensory input. Post-treatment, the synaptic weight of the amblyopic sensory input becomes slightly stronger than that of the normal input, achieving a more balanced signal transmission to the cortex.

2. Binocular Deprivation

Weight value between sensory neuron and cortex neuron under different conditions

The above figure illustrates synaptic weight changes between the right/left sensory neuron and the cortex neuron under various conditions:

  1. Normal Rearing: Equal synaptic weights for both sensory inputs.
  2. Monocular Deprivation: Reduced synaptic weight in the deprived sensory neuron.
  3. Binocular Deprivation: Slight reduction in synaptic weights for both neurons.
  4. Reverse Suture: Restoration of the initially deprived neuron's synaptic weight, coupled with a reduction in the other.
  5. Binocular Recovery: An increase in synaptic weights for both neurons.

We compare these results with real experimental data.

Real experiment

Our model aligns closely with the experiment except in the final phase, where the actual experiment achieves binocular balance. This discrepancy offers an opportunity for further investigation into the model's parameters or assumptions.

Conclusions

Based on the results, we draw the following conclusions:

  1. Effective amblyopia treatment involves reducing the input strength to the normal sensory input, demonstrating the adaptability of neural connections.
  2. Under typical conditions, binocular balance is naturally achieved through the mechanisms of Hebbian learning and BCM theory.
  3. In the case of binocular deprivation, early-age synaptic weight adjustments are feasible due to higher learning rates.
  4. Real-world experiments on binocular deprivation show that post-recovery phase synaptic weights can balance out with equal strength inputs, even if initial synaptic weights were imbalanced.

References
