
Comments (10)

Hananel-Hazan avatar Hananel-Hazan commented on June 18, 2024

Thank you for raising this issue.
From your description, the weights should be updating...

Can you post two more figures?

  1. A figure that plots the voltage of each neuron vs. time.
  2. A figure that plots the input values / spike trains that the input layer received.
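If it helps, a minimal sketch of producing those two diagnostic figures. Synthetic arrays stand in for the monitor output here, since the monitor and layer names depend on your network; in a real run the voltage and spike data would come from BindsNET monitors.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # headless backend, no display needed
import matplotlib.pyplot as plt

# Synthetic stand-ins for monitor output (shapes are placeholders).
time = np.arange(500)
voltages = -65.0 + 15.0 * np.random.rand(500, 5)   # 5 sample neurons
spikes = np.random.rand(500, 20) < 0.05            # 20 input neurons

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))

# Figure 1: membrane voltage of each neuron vs. time.
ax1.plot(time, voltages)
ax1.set_ylabel("membrane voltage (mV)")

# Figure 2: raster of the spikes the input layer received.
t_idx, n_idx = np.nonzero(spikes)
ax2.scatter(t_idx, n_idx, s=2)
ax2.set_xlabel("time step")
ax2.set_ylabel("input neuron")
fig.savefig("diagnostics.png")
```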

from bindsnet.

rafaelblevin821 avatar rafaelblevin821 commented on June 18, 2024

What dataset are you training it on?

It looks like the network is not learning.

Can you save the weights and share them?

I use this code to save weights:

    import os
    import numpy as np

    # `% 1` saves every epoch; raise the modulus to save less often.
    if epoch == 0 or (epoch + 1) % 1 == 0:
        np.savetxt(
            os.path.join('network', cur_time, 'weights', f'weights_epoch_{epoch + 1}.csv'),
            network.connections[('X', 'Ae')].w.detach().cpu().numpy(),
            delimiter=',', fmt='%0.3f')
        print(f"Weights saved for epoch {epoch + 1}")

from bindsnet.

289848771 avatar 289848771 commented on June 18, 2024

What dataset are you training it on?

It looks like the network is not learning.

Can you save the weights and share them?

I use this code to save weights:

    if epoch == 0 or (epoch + 1) % 1 == 0:
        np.savetxt(
            os.path.join('network', cur_time, 'weights', f'weights_epoch_{epoch + 1}.csv'),
            network.connections[('X', 'Ae')].w.detach().cpu().numpy(), delimiter=',', fmt='%0.3f')
        print(f"Weights saved for epoch {epoch + 1}")

Hello, I am trying to use inverse kinematics data from a robotic arm to drive it along a circular trajectory via torque control. I encode the arm's joint position and velocity information and feed it into an SNN built with BindsNet. After training, the output layer emits spikes, which I convert into torque commands through a simple linear formula and send to the robotic arm to control its movement.

You are right: the network I built did not learn, and its connection weights never changed.

I saved the weights for the first 500 steps of the network run to a txt file (see the attachment):
weight.txt
All of the connection weights remain at their initial value throughout the run (the connection is fully connected, with every element of the weight matrix set to 1.6, and the weights never changed during the entire run).

Since this connection layer is huge, I cannot save the weights for all 5000 steps. My source layer has 70,000 LIF neurons and my target layer has 700 LIF neurons; they are fully connected, so the weight matrix is very large. Here is my weight print statement:

    with open("Circle_Trajectory_Save/weight.txt", "w") as weight_file:
        weight_file.write(str(network.monitors['gc_pc_monitor'].get("w")))

I could try saving the weights over different time periods if you need.
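As an aside, `str()` on a large tensor truncates the middle with ellipses, so a text dump cannot hold the full matrix. A sketch of saving the complete matrix in binary instead; the `('GC', 'PC')` connection key is an assumption based on the diagram, and a small random matrix stands in so the snippet is self-contained:

```python
import numpy as np

# In the real network the matrix would come from the connection itself, e.g.
#   w = network.connections[('GC', 'PC')].w.detach().cpu().numpy()
# (the ('GC', 'PC') key is an assumption). A small matrix stands in here.
w = np.random.rand(70, 7)

# Binary .npy keeps the full matrix at full precision; str() on a tensor
# this large would elide most of its entries.
np.save("weight_step.npy", w)
w_back = np.load("weight_step.npy")
```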

from bindsnet.

289848771 avatar 289848771 commented on June 18, 2024

Thank you for raising this issue. From your description, the weights should be updating...

Can you post two more figures?

  1. A figure that plots the voltage of each neuron vs. time.
  2. A figure that plots the input values / spike trains that the input layer received.

Hello, thank you very much for your reply.

I will provide the two figures you asked for. But first, let me briefly explain the purpose of my network design and the SNN structure. I am trying to use inverse kinematics data from a robotic arm to drive it along a circular trajectory via torque control. I encode the arm's joint position and velocity information and feed it into an SNN built with BindsNet. After training, the output layer emits spikes, which I convert into torque commands through a simple linear formula and send to the robotic arm to control its movement.

That covers the purpose of my SNN model. Here is a simple schematic diagram of the network:
SNN_network

As shown in the picture, my network has two input layers, the MF layer and the CF layer, and one output layer, the DCN layer. Only the GC-PC connection uses the STDP learning rule. The GC layer is the source layer with 70,000 LIF neurons; the PC layer is the target layer with 700 LIF neurons, and they are fully connected. So I set the GC-PC connection matrix to a constant matrix in which every element is 1.6.
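For scale: a fully connected 70,000 × 700 layer holds 49 million weights, roughly 196 MB per float32 snapshot, which is why saving it as text at every step is impractical. A quick arithmetic check:

```python
n_gc, n_pc = 70_000, 700
n_weights = n_gc * n_pc              # 49,000,000 synapses in the GC-PC matrix
mb_float32 = n_weights * 4 / 1e6     # ~196 MB per float32 snapshot
print(n_weights, mb_float32)
```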

Here are the figures you asked for.

First, the voltage-vs-time relationship for each neuron layer, shown below. (My SNN model has five neuron layers in total; two of them are input layers and have no voltage attribute, so I can only provide the voltage-time plots for GC, PC, and DCN.)
voltages

Next is a plot of the input spikes received by the input layers (MF and CF), shown below.
Input_spikes

from bindsnet.

Hananel-Hazan avatar Hananel-Hazan commented on June 18, 2024

Thank you for providing the figures.
It doesn't seem like the GC is getting any information from the MF. Since the PC and DCN appear very active, their activity is likely coming from the CF. Because the GC is not active, either the MF-GC connection is too weak, the MF has little information to transmit, or the encoding of the MF input doesn't elicit enough activity.

Suggestions:

  1. Since the DCN already receives input from both MF and CF, can you route the output of CF to the GC as well (making it similar to the DCN)? It may be that the input from MF alone is not strong enough, or that some sort of encoding is needed to transform the given input.

  2. Can you increase the weights between MF and GC by factors of 10, 100, or 1000? I would like to see some voltage changes in the GC part of the voltage figure.
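For reference, suggestion 2 amounts to scaling the connection's weight tensor in place. A minimal sketch; the `('MF', 'GC')` key is assumed from the diagram, and a NumPy array stands in for the weight tensor so the snippet is self-contained:

```python
import numpy as np

# Stand-in for the MF->GC weight tensor; in BindsNET this would be
# network.connections[('MF', 'GC')].w (the key is an assumption).
w = np.full((100, 50), 0.016)

for scale in (10, 100, 1000):
    w_scaled = w * scale             # re-run and re-plot GC voltages per factor
    print(scale, float(w_scaled.max()))
```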

from bindsnet.

289848771 avatar 289848771 commented on June 18, 2024

Thank you for providing the figures. It doesn't seem like the GC is getting any information from the MF. Since the PC and DCN appear very active, their activity is likely coming from the CF. Because the GC is not active, either the MF-GC connection is too weak, the MF has little information to transmit, or the encoding of the MF input doesn't elicit enough activity.

Suggestions:

  1. Since the DCN already receives input from both MF and CF, can you route the output of CF to the GC as well (making it similar to the DCN)? It may be that the input from MF alone is not strong enough, or that some sort of encoding is needed to transform the given input.
  2. Can you increase the weights between MF and GC by factors of 10, 100, or 1000? I would like to see some voltage changes in the GC part of the voltage figure.

Thank you very much for your detailed advice. I apologize for the delay in replying.

I will also try routing the output of the CF to the GC, similar to the DCN. Additionally, I will try increasing the weights between MF and GC by factors of 10, 100, and 1000 to observe the voltage changes in the GC portion of the voltage plot.

It is worth mentioning that, due to my algorithm's design, exactly 7 of the 70,000 GC neurons can and must be activated at each time step (the algorithm requires that the GC's sparse activation property be maintained). I wonder whether this setting is what makes the MF-GC connection too weak, as you said. If this sparse-activation setting must be kept, is there any way to increase the GC's influence so that PC activity is driven more by the GC than by the CF?
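The 7-of-70,000 rule is a k-winner-take-all constraint. A minimal sketch of that selection step (pure NumPy, with a synthetic vector standing in for the GC membrane potentials):

```python
import numpy as np

def k_winners(v, k=7):
    """Indices of the k most-excited neurons (k-winner-take-all)."""
    # argpartition puts the k smallest of -v (= k largest of v) first.
    return np.argpartition(-v, k)[:k]

rng = np.random.default_rng(0)
v = rng.standard_normal(70_000)      # stand-in for GC membrane potentials
winners = k_winners(v, 7)
```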

Thank you very much for your help and I apologize again for my delayed reply.

from bindsnet.

Hananel-Hazan avatar Hananel-Hazan commented on June 18, 2024

It is worth mentioning that, due to my algorithm's design, exactly 7 of the 70,000 GC neurons can and must be activated at each time step (the algorithm requires that the GC's sparse activation property be maintained). I wonder whether this setting is what makes the MF-GC connection too weak, as you said. If this sparse-activation setting must be kept, is there any way to increase the GC's influence so that PC activity is driven more by the GC than by the CF?

If only seven neurons in the GC can be active at each step, there might be a need for an echo chamber to enhance the activity. This echo chamber could be a liquid state machine, which is a type of recurrent network, positioned between the GC and PC.
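As an illustration only, here is the echo-chamber idea as a plain rate-based reservoir in NumPy. An actual liquid state machine would use spiking neurons (e.g. recurrently connected LIF layers); sizes are scaled down from 70,000 so the sketch runs quickly:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_res = 1_000, 200             # scaled down from the 70,000 GC neurons

# Sparse input: only 7 active neurons per step, as in the GC.
u = np.zeros(n_in)
u[rng.choice(n_in, size=7, replace=False)] = 1.0

W_in = rng.standard_normal((n_res, n_in)) * 0.5
# Recurrent weights scaled for a spectral radius below 1 (echo-state property).
W_rec = rng.standard_normal((n_res, n_res)) * (0.9 / np.sqrt(n_res))

x = np.zeros(n_res)
for _ in range(20):                  # the recurrence "echoes" the sparse input
    x = np.tanh(W_in @ u + W_rec @ x)
```

Even though only 7 input units fire, essentially the whole reservoir state becomes active, giving the downstream layer a much richer signal to learn from.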

from bindsnet.

289848771 avatar 289848771 commented on June 18, 2024

It is worth mentioning that, due to my algorithm's design, exactly 7 of the 70,000 GC neurons can and must be activated at each time step (the algorithm requires that the GC's sparse activation property be maintained). I wonder whether this setting is what makes the MF-GC connection too weak, as you said. If this sparse-activation setting must be kept, is there any way to increase the GC's influence so that PC activity is driven more by the GC than by the CF?

If only seven neurons in the GC can be active at each step, there might be a need for an echo chamber to enhance the activity. This echo chamber could be a liquid state machine, which is a type of recurrent network, positioned between the GC and PC.

Thank you very much for your suggestion, I feel very inspired by this suggestion. I will try the method you provided.

In addition, I would like to ask you: Can the echo chamber you are talking about be a recurrent network based on a spiking neural network? Does BindsNET support building recurrent networks? Or does BindsNET have a library for developing recurrent networks?

Thanks again for your help.

from bindsnet.

Hananel-Hazan avatar Hananel-Hazan commented on June 18, 2024

BindsNET supports recurrent connections; see the reservoir example.

from bindsnet.

289848771 avatar 289848771 commented on June 18, 2024

BindsNET supports recurrent connections; see the reservoir example.

Thank you so much, I will try it next.

from bindsnet.
