
Comments (9)

robertleeplummerjr commented on May 3, 2024

Really, why are there different RNN and DenseNet APIs? They should be the same; it's just an abstraction.

The idea of brain.js is simplicity first. By not abstracting the base neural nets, we can optimize towards speed, and keep the learning curve low for understanding what is actually happening in the neural net, keeping things composable and understandable. However, I'd be open to abstracting if it meant meeting that goal. Also, the LSTM and GRU are abstractions. In the end, I think we wanted to just get it working 😋. Like Addy Osmani said: "First do it. Then do it right. Then do it better."
Also, I’m unaware of the “DenseNet” (a Densely Connected Convolutional Network?) in our codebase that you speak of.

Also, I really think you are over-bloating the code. Really, guys, review the codebase; at synaptic2 we have an RNN in roughly one-seventh as many lines.

I applaud your hard work towards less is more, and the synaptic project is fantastic. The code we started with was https://github.com/karpathy/recurrentjs, so it is just a reflection of that code aimed toward the brain.js API. We also have ES6 src, which is compiled into ES5 and a special browser (and browser min) file. This is all done automatically, but does add to the codebase. I’m also interested where the figure “7 times smaller” comes from; it would be helpful for comparing other projects in the future. We are also in the middle of removing things like tests that bring in dependencies. Does synaptic2 use matrix operations?

And the last thing: why, for God's sake, do you have an internal matrix lib? We have scijs, sylvester, and vectorious. Or are you trying to build your own ecosystem?

The actual matrix library was brought in from recurrentjs; we simplified it, cut down on matrix instantiation, and split it apart so that when we do .toFunction() we can get the inner value of the objects and reuse it as a non-function operation (one example, which I sent to my mother, outputs “hi mom!”). Also, the actual matrix code fits in about 8 lines:

class Matrix {
  constructor(rows, columns) {
    this.rows = rows;
    this.columns = columns;
    this.weights = new Float64Array(rows * columns);
    this.recurrence = new Float64Array(rows * columns);
  }
}

The rest of it is some leftover methods from recurrentjs (not used; thank you for bringing that to my attention), plus utility functions for going to and from JSON, and the math. The math was again brought in from recurrentjs and does very simple operations that will eventually be made to run on the GPU where available. If there is a library that would give us what we have (reused matrices, relu, tanh, rowPluck, sampleI, maxI), I’m all for using it.
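To illustrate the reuse idea, an op like relu can write into a preallocated product matrix instead of allocating a new one on every forward pass. The following is only a sketch in that spirit, not the actual brain.js implementation (the Matrix class is repeated here so the snippet runs standalone):

```javascript
// Sketch only: an element-wise relu that reuses a preallocated output
// matrix. The real brain.js code differs in detail.
class Matrix {
  constructor(rows, columns) {
    this.rows = rows;
    this.columns = columns;
    this.weights = new Float64Array(rows * columns);
    this.recurrence = new Float64Array(rows * columns);
  }
}

function relu(product, input) {
  for (let i = 0; i < input.weights.length; i++) {
    product.weights[i] = Math.max(0, input.weights[i]);
    product.recurrence[i] = 0; // clear gradient storage for the next backprop pass
  }
  return product;
}

const input = new Matrix(1, 3);
input.weights.set([-1, 0.5, 2]);
const product = new Matrix(1, 3); // allocated once, reused across forward passes
const out = relu(product, input);
console.log(Array.from(out.weights)); // [0, 0.5, 2]
```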

Thank you for your time in looking at our codebase!

from brain.js.

Dok11 commented on May 3, 2024

@Jabher, maybe you can help? :)


Jabher commented on May 3, 2024

I would suggest you'll need some general parent to inherit from at some point 👀

Really, why are there different RNN and DenseNet APIs? They should be the same; it's just an abstraction.

Also, I really think you are over-bloating the code. Really, guys, review the codebase; at synaptic2 we have an RNN in roughly one-seventh as many lines.

And the last thing: why, for God's sake, do you have an internal matrix lib? We have scijs, sylvester, and vectorious. Or are you trying to build your own ecosystem? :trollface:


Jabher commented on May 3, 2024

Also, you should probably think about nesting, splitting, and merging all of this. Video processing, for example, is actually ConvNet -> RNN -> Dense, and I see no API to implement something of that kind (setting aside the fact that there is no convnet yet). I mean, you do not have an API like new Net([ConvNet, RNN, Dense]).
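For what it's worth, the kind of stacking API being asked for could look roughly like the sketch below. Everything here is hypothetical (neither Net nor these stand-in layers exist in brain.js); it only illustrates feeding each layer's output into the next:

```javascript
// Hypothetical sketch of a composable Net: each "layer" is any object
// with a run(input) method, and the net pipes data through them in order.
class Net {
  constructor(layers) {
    this.layers = layers;
  }
  run(input) {
    // Feed the output of each layer into the next one.
    return this.layers.reduce((signal, layer) => layer.run(signal), input);
  }
}

// Stand-in layers that just transform arrays, to show the data flow.
const double = { run: xs => xs.map(x => x * 2) };
const addOne = { run: xs => xs.map(x => x + 1) };

const net = new Net([double, addOne]);
console.log(net.run([1, 2, 3])); // [3, 5, 7]
```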


robertleeplummerjr commented on May 3, 2024

Also you should probably think about nesting, splitting and merging of all this stuff. Video processing...

I have no idea what you are talking about here, not to sound negative. We do plan on adding convolutional neural networks, but have not started yet.


robertleeplummerjr commented on May 3, 2024

To be clearer on the original issue, the recurrent neural net currently supports objects like

[
  {input: "a string", output: "another string"}
]
[
  {input: [100], output: "another string"}
]
[
  {input: [100], output: [1]}
]
[
  {input: [true], output: [false]}
]

examples here
The current implementation treats each value as a unique input, so it just doesn't support objects like { r: 0.03, g: 0.7, b: 0.5 }. If it is possible to change this, how?


Jabher commented on May 3, 2024

The idea of brain.js is simplicity first. By not abstracting the base neural nets, we can optimize towards speed, and keep the learning curve low for understanding what is actually happening in the neural net, keeping things composable and understandable. However, I'd be open to abstracting if it meant meeting that goal. Also, the LSTM and GRU are abstractions. In the end, I think we wanted to just get it working 😋.

Possibly I explained it wrong. I've been talking about class Net extends AbstractNet, class RNN extends AbstractNet, and so on, keeping shared logic like the train fn in that AbstractNet.
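A minimal sketch of that structure, under the assumption that each net only needs to supply its own per-pattern training step (AbstractNet and trainPattern are illustrative names, not the actual brain.js API):

```javascript
// Sketch only: shared training logic lives in an abstract base class;
// each concrete net implements just its own forward/backward pass.
class AbstractNet {
  train(data, iterations = 100) {
    let error = Infinity;
    for (let i = 0; i < iterations; i++) {
      // Average the per-pattern error over the training set.
      error = data.reduce((sum, item) => sum + this.trainPattern(item), 0) / data.length;
    }
    return { error, iterations };
  }
  trainPattern(item) {
    throw new Error('subclass must implement trainPattern');
  }
}

class RNN extends AbstractNet {
  trainPattern(item) {
    // ...forward pass, backpropagation through time, weight update...
    return 0.01; // placeholder error for the sketch
  }
}

class NeuralNetwork extends AbstractNet {
  trainPattern(item) {
    // ...forward pass, backpropagation, weight update...
    return 0.02; // placeholder error for the sketch
  }
}

const stats = new RNN().train([{ input: [0], output: [1] }], 10);
console.log(stats); // { error: 0.01, iterations: 10 }
```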

Also, I’m unaware of the “DenseNet” (a Densely Connected Convolutional Network?) in our codebase that you speak of.

I was talking about Dense layers (also known as Fully-Connected). Just a common layer, the foundation of any net (together with the activation layer) :).

I’m also interested where the figure “7 times smaller” comes from; it would be helpful for comparing other projects in the future

We're in the middle of everything, so showing something is kind of pointless, but we're using rushter's repo as something we're checking against (to make sure we're not doing something wrong with the matrices), and the SLOC count is more or less the same. The RNN here is about 100 lines:
https://github.com/rushter/MLAlgorithms/blob/master/mla/neuralnet/layers/recurrent/rnn.py

Does synaptic2 use matrix operations?

Well. We're more about tensors. Heavily. We're thinking only in terms of tensors now ;)

If there is a library that would give us what we have (reused matrices, relu, tanh, rowPluck, sampleI, maxI), I’m all for using it.

At stage 1 ("just make it work") we were using vectorious (randomly selected from a bunch of existing libs) for math operations and webmonkeys (also a random choice) as a prototype for GPU-accelerated matrix ops. Now we're onto our custom multi-target BLAS lib.

Also you should probably think about nesting, splitting and merging of all this stuff. Video processing...

I have no idea what you are talking about here, not to sound negative. We do plan on adding convolutional neural networks, but have not started yet.

I was just giving an example. You should probably consider an option for communication between different layers, in case the user wants to stack them in any order. Convnet was just an example.

Aaaand a last question

The current implementation treats each value as a unique input, so it just doesn't support objects like { r: 0.03, g: 0.7, b: 0.5 }. If it is possible to change this, how?

As I said previously, consider some refactoring. Your fully-connected layer does support this kind of input, so you can extract that logic and apply it to other kinds of layers (rnn, lstm); then you'll have the same API across all layers. An inheritance mechanism works well here as long as you're thinking in terms of OOP. You probably simply want to use this fn https://github.com/harthur-org/brain.js/blob/master/src/neural-network.js#L256 here https://github.com/harthur-org/brain.js/blob/master/src/recurrent/rnn.js#L408 somehow.
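As a sketch of what that extraction might look like, a lookup table can map object keys to array indices so that { r, g, b } objects become plain arrays a net can consume (buildLookup and toArray are hypothetical helper names, not the actual formatData implementation):

```javascript
// Sketch only: build a key -> index lookup table from the training data,
// then flatten each object into a plain numeric array.
function buildLookup(objects) {
  const lookup = {};
  let index = 0;
  for (const obj of objects) {
    for (const key of Object.keys(obj)) {
      if (!(key in lookup)) lookup[key] = index++;
    }
  }
  return lookup;
}

function toArray(lookup, obj) {
  // Missing keys default to 0 so every array has the same length.
  const arr = new Array(Object.keys(lookup).length).fill(0);
  for (const key of Object.keys(obj)) {
    arr[lookup[key]] = obj[key];
  }
  return arr;
}

const lookup = buildLookup([{ r: 0.03, g: 0.7, b: 0.5 }]);
console.log(toArray(lookup, { r: 0.03, g: 0.7, b: 0.5 })); // [0.03, 0.7, 0.5]
```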


robertleeplummerjr commented on May 3, 2024

Possibly I explained it wrong.

It could be that I am still naive, though eager to learn!

I was talking about Dense layers

Oh ok, cool.

We're in progress of everything

That is what I hear, and making good progress, no doubt!

The RNN here is about 100 lines

Nice! Though I see it uses numpy (as nearly all Python implementations of an RNN I've seen do), so it's not really a fair comparison: brain.js uses no third-party plugins for building the neural net structure (at this time, though we do want to address that at some later date), specifically the recurrent neural net. The Python code also doesn't have a toFunction method, which is sort of a second implementation that removes dependencies of any type. Not that it isn't a good point though 😛.

This is the branch where we've implemented Vocab into the train method, so that the API is very close to the ANN's: https://github.com/harthur-org/brain.js/tree/rnn-train-pattern

Specifically here: https://github.com/harthur-org/brain.js/blob/rnn-train-pattern/src/recurrent/rnn.js#L210

This is fed in formatted data from here: https://github.com/harthur-org/brain.js/blob/rnn-train-pattern/src/recurrent/rnn.js#L740

Vocab is very similar to the inputs from formatData, but they are specific, not dynamic.

And then we reverse the process after training with the run method, which predicts what comes next (or, when the input is blank, creates an entire prediction) and outputs it here: https://github.com/harthur-org/brain.js/blob/rnn-train-pattern/src/recurrent/rnn.js#L750

In the case of the ANN that is brain.js' default network, brain.NeuralNetwork, each weight is a number, and we simply iterate over the values from formatData as if they were an index (or as an index, if the input is an array), and do something similar on the output. In the case of the rnn, each input is an array (or a plucked matrix row: https://github.com/harthur-org/brain.js/blob/rnn-train-pattern/src/recurrent/matrix/row-pluck.js). How would you suggest we achieve the output?

As you are probably aware, I have a lot to learn, and I'm sure I sound naive; I apologize for that and am learning as much as possible as I go. Thank you again for your awesome feedback!


robertleeplummerjr commented on May 3, 2024

This is now completed with https://gist.github.com/robertleeplummerjr/713a47d5fd63e8e189f8cf5cbc0649cd in brain.js 1.6.0+ for recurrent time step neural networks.

