
brain.js's Introduction


brain.js

GPU accelerated Neural networks in JavaScript for Browsers and Node.js


About

brain.js is a GPU accelerated library for Neural Networks written in JavaScript.

💡 This is a continuation of harthur/brain, which is not maintained anymore. More info


Installation and Usage

NPM

You can install brain.js with npm:

npm install brain.js

CDN

<script src="//unpkg.com/brain.js"></script>

Download

Download the latest brain.js for the browser

Installation note

Brain.js depends on a native module, headless-gl, for GPU support. In most cases installing brain.js from npm should just work. However, if you run into problems, it usually means the prebuilt binaries could not be downloaded from GitHub and you may need to build them yourself.

Building from source

Please make sure the following dependencies are installed and up to date and then run:

npm rebuild
System dependencies
Mac OS X
Ubuntu/Debian
sudo apt-get install -y build-essential libglew-dev libglu1-mesa-dev libxi-dev pkg-config
Windows

  • If you are using Build Tools 2017, run npm config set msvs_version 2017. Note: this no longer works in modern versions of npm.

Examples

Here's an example showing how to approximate the XOR function using brain.js (more info on the config options here).

💡 A fun and practical introduction to Brain.js

// provide optional config object (or undefined). Defaults shown.
const config = {
  binaryThresh: 0.5,
  hiddenLayers: [3], // array of ints for the sizes of the hidden layers in the network
  activation: 'sigmoid', // supported activation types: ['sigmoid', 'relu', 'leaky-relu', 'tanh'],
  leakyReluAlpha: 0.01, // supported for activation type 'leaky-relu'
};

// create a simple feed-forward neural network with backpropagation
const net = new brain.NeuralNetwork(config);

net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
]);

const output = net.run([1, 0]); // [0.987]

Or, using a recurrent neural network (more info on the config options here):

// provide optional config object, defaults shown.
const config = {
  inputSize: 20,
  inputRange: 20,
  hiddenLayers: [20, 20],
  outputSize: 20,
  learningRate: 0.01,
  decayRate: 0.999,
};

// create a simple recurrent neural network
const net = new brain.recurrent.RNN(config);

net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] },
]);

let output = net.run([0, 0]); // [0]
output = net.run([0, 1]); // [1]
output = net.run([1, 0]); // [1]
output = net.run([1, 1]); // [0]

However, there is no reason to use a neural network to figure out XOR. (-: So, here is a more involved, realistic example: Demo: training a neural network to recognize color contrast.

More Examples

Brain.js Examples Repo

You can check out this fantastic screencast, which explains how to train a simple neural network using a real-world dataset: How to create a neural network in the browser using Brain.js.

Training

Use train() to train the network with an array of training data. The network has to be trained with all the data in bulk, in one call to train(). Training with more patterns will probably take longer, but usually results in a network that is better at classifying new patterns.

Note

Training is computationally expensive, so you should try to train the network offline (or on a Worker) and use the toFunction() or toJSON() options to plug the pre-trained network into your website.
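
For example, a minimal sketch of offline training plus reuse via JSON (the file name trained-net.json and the fs usage are illustrative, not part of the brain.js API):

const fs = require('fs');
const brain = require('brain.js');

// Train once, offline (e.g. in a Node script or a Worker).
const net = new brain.NeuralNetwork();
net.train([
  { input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 } },
  { input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 } },
]);

// Persist the trained state so the website never has to train.
fs.writeFileSync('trained-net.json', JSON.stringify(net.toJSON()));

// Later, restore the pre-trained network and run it directly.
const restored = new brain.NeuralNetwork();
restored.fromJSON(JSON.parse(fs.readFileSync('trained-net.json', 'utf8')));
console.log(restored.run({ r: 1, g: 0.4, b: 0 }));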

Data format

For training with NeuralNetwork

Each training pattern should have an input and an output, both of which can be either an array of numbers from 0 to 1 or a hash of numbers from 0 to 1. For the color contrast demo it looks something like this:

const net = new brain.NeuralNetwork();

net.train([
  { input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 } },
  { input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 } },
  { input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 } },
]);

const output = net.run({ r: 1, g: 0.4, b: 0 }); // { white: 0.99, black: 0.002 }

Here's another variation of the above example. (Note that input objects do not need to be similar.)

net.train([
  { input: { r: 0.03, g: 0.7 }, output: { black: 1 } },
  { input: { r: 0.16, b: 0.2 }, output: { white: 1 } },
  { input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 } },
]);

const output = net.run({ r: 1, g: 0.4, b: 0 }); // { white: 0.81, black: 0.18 }

For training with RNNTimeStep, LSTMTimeStep and GRUTimeStep

Each training pattern can either:

  • Be an array of numbers
  • Be an array of arrays of numbers

Example using an array of numbers:

const net = new brain.recurrent.LSTMTimeStep();

net.train([[1, 2, 3]]);

const output = net.run([1, 2]); // 3

Example using an array of arrays of numbers:

const net = new brain.recurrent.LSTMTimeStep({
  inputSize: 2,
  hiddenLayers: [10],
  outputSize: 2,
});

net.train([
  [1, 3],
  [2, 2],
  [3, 1],
]);

const output = net.run([
  [1, 3],
  [2, 2],
]); // [3, 1]

For training with RNN, LSTM and GRU

Each training pattern can either:

  • Be an array of values
  • Be a string
  • Have an input and an output
    • Either of which can have an array of values or a string

CAUTION: When using an array of values, you can use ANY value; however, each distinct value is represented in the neural network by its own input, so the more distinct values you have, the larger your input layer. If you have hundreds, thousands, or millions of floating-point values, THIS IS NOT THE RIGHT CLASS FOR THE JOB. Also, when deviating from strings, this functionality is still in beta.

Example using direct strings: Hello World Using Brainjs

const net = new brain.recurrent.LSTM();

net.train(['I am brainjs, Hello World!']);

const output = net.run('I am brainjs');
alert(output);

Another example:

const net = new brain.recurrent.LSTM();

net.train([
  'doe, a deer, a female deer',
  'ray, a drop of golden sun',
  'me, a name I call myself',
]);

const output = net.run('doe'); // ', a deer, a female deer'

Example using strings with inputs and outputs:

const net = new brain.recurrent.LSTM();

net.train([
  { input: 'I feel great about the world!', output: 'happy' },
  { input: 'The world is a terrible place!', output: 'sad' },
]);

const output = net.run('I feel great about the world!'); // 'happy'

Training Options

train() takes a hash of options as its second argument:

net.train(data, {
  // Default values --> expected validation
  iterations: 20000, // the maximum times to iterate the training data --> number greater than 0
  errorThresh: 0.005, // the acceptable error percentage from training data --> number between 0 and 1
  log: false, // true to use console.log, when a function is supplied it is used --> Either true or a function
  logPeriod: 10, // iterations between logging out --> number greater than 0
  learningRate: 0.3, // scales with delta to affect training rate --> number between 0 and 1
  momentum: 0.1, // scales with next layer's change value --> number between 0 and 1
  callback: null, // a periodic call back that can be triggered while training --> null or function
  callbackPeriod: 10, // the number of iterations through the training data between callback calls --> number greater than 0
  timeout: number, // the max number of milliseconds to train for --> number greater than 0. Default --> Infinity
});

The network will stop training whenever one of the two criteria is met: the training error has gone below the threshold (default 0.005), or the max number of iterations (default 20000) has been reached.

By default, training will not let you know how it's doing until the end, but set log to true to get periodic updates on the current training error of the network; the training error should decrease each time, and the updates are printed to the console. If you set log to a function, that function is called with the updates instead of printing to the console. If you want to use the values of the updates in your own output, set the callback option to a function instead.
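
As a rough sketch of both approaches (data stands in for your training array, updateMyUI is a hypothetical function of your own, and the exact shape passed to log can vary between brain.js versions):

const net = new brain.NeuralNetwork();

net.train(data, {
  log: (status) => console.log(status), // called with the periodic training status instead of console.log
  logPeriod: 100, // log every 100 iterations
  callback: (status) => updateMyUI(status.error, status.iterations), // use the raw values yourself
  callbackPeriod: 50,
});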

The learning rate is a parameter that influences how quickly the network trains. It's a number from 0 to 1. If the learning rate is close to 0, training takes longer. If the learning rate is closer to 1, training is faster, but the results may be constrained to a local minimum and perform badly on new data (overfitting). The default learning rate is 0.3.

Momentum is similar to the learning rate, also expecting a value from 0 to 1, but it is multiplied against the next layer's change value. The default value is 0.1.

Any of these training options can be passed into the constructor, or passed into the updateTrainingOptions(opts) method; they are saved on the network and used during training. If you save your network to JSON, these training options are saved and restored as well (except for callback and log: callback is forgotten and log is restored using console.log).
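
For instance, a minimal sketch of setting training options ahead of time:

// Training options can be given in the constructor...
const net = new brain.NeuralNetwork({
  hiddenLayers: [4],
  iterations: 30000,
  errorThresh: 0.004,
});

// ...or updated later; they are stored on the network and used when train() runs.
net.updateTrainingOptions({ learningRate: 0.1, momentum: 0.2 });

// They survive serialization too (callback and log excepted, as noted above).
const json = net.toJSON();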

A boolean property called invalidTrainOptsShouldThrow is set to true by default. While it is true, entering a training option that is outside the normal range throws an error with a message about the abnormal option. When it is set to false, no error is thrown, but a message with the related information is still printed via console.warn.

Async Training

trainAsync() takes the same arguments as train() (data and options). Instead of returning the results object from training, it returns a promise that resolves with the training results object. It does NOT work with:

  • brain.recurrent.RNN
  • brain.recurrent.GRU
  • brain.recurrent.LSTM
  • brain.recurrent.RNNTimeStep
  • brain.recurrent.GRUTimeStep
  • brain.recurrent.LSTMTimeStep

const net = new brain.NeuralNetwork();
net
  .trainAsync(data, options)
  .then((res) => {
    // do something with my trained network
  })
  .catch(handleError);

With multiple networks you can train in parallel like this:

const net = new brain.NeuralNetwork();
const net2 = new brain.NeuralNetwork();

const p1 = net.trainAsync(data, options);
const p2 = net2.trainAsync(data, options);

Promise.all([p1, p2])
  .then((values) => {
    const res = values[0];
    const res2 = values[1];
    console.log(
      `net trained in ${res.iterations} and net2 trained in ${res2.iterations}`
    );
    // do something super cool with my 2 trained networks
  })
  .catch(handleError);

Cross Validation

Cross validation can provide a less fragile way of training on larger data sets. The brain.js API provides cross validation as in this example:

const crossValidate = new brain.CrossValidate(() => new brain.NeuralNetwork(networkOptions));
crossValidate.train(data, trainingOptions, k); //note k (or KFolds) is optional
const json = crossValidate.toJSON(); // all stats in json as well as neural networks
const net = crossValidate.toNeuralNetwork(); // get top performing net out of `crossValidate`

// optionally later
const json = crossValidate.toJSON();
const net = crossValidate.fromJSON(json);

Use CrossValidate with these classes:

  • brain.NeuralNetwork
  • brain.RNNTimeStep
  • brain.LSTMTimeStep
  • brain.GRUTimeStep

An example of using cross validate can be found in cross-validate.ts

Methods

train(trainingData) -> trainingStatus

The output of train() is a hash of information about how the training went:

{
  error: 0.0039139985510105032,  // training error
  iterations: 406                // training iterations
}

run(input) -> prediction

Supported on classes:

  • brain.NeuralNetwork
  • brain.NeuralNetworkGPU -> All the functionality of brain.NeuralNetwork, but run on the GPU (via gpu.js using WebGL2, WebGL1, or a CPU fallback)
  • brain.recurrent.RNN
  • brain.recurrent.LSTM
  • brain.recurrent.GRU
  • brain.recurrent.RNNTimeStep
  • brain.recurrent.LSTMTimeStep
  • brain.recurrent.GRUTimeStep

Example:

// feed forward
const net = new brain.NeuralNetwork();
net.fromJSON(json);
net.run(input);

// time step
const net = new brain.recurrent.LSTMTimeStep();
net.fromJSON(json);
net.run(input);

// recurrent
const net = new brain.recurrent.LSTM();
net.fromJSON(json);
net.run(input);

forecast(input, count) -> predictions

Available with the following classes. Outputs an array of predictions, each prediction being a continuation of the inputs.

  • brain.recurrent.RNNTimeStep
  • brain.recurrent.LSTMTimeStep
  • brain.recurrent.GRUTimeStep

Example:

const net = new brain.recurrent.LSTMTimeStep();
net.fromJSON(json);
net.forecast(input, 3);

toJSON() -> json

Serialize neural network to json

fromJSON(json)

Deserialize neural network from json

Failing

If the network failed to train, the error will be above the error threshold. This could happen if the training data is too noisy (most likely), the network does not have enough hidden layers or nodes to handle the complexity of the data, or it has not been trained for enough iterations.

If the training error is still something huge like 0.4 after 20000 iterations, it's a good sign that the network can't make sense of the given data.
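
For example, a quick sanity check after training (trainingData stands in for your own data set; 0.005 is the default errorThresh):

const net = new brain.NeuralNetwork();
const stats = net.train(trainingData);

if (stats.error > 0.005) {
  // The network failed to train: try cleaner data, more hidden nodes, or more iterations.
  console.warn(`failed to train: error ${stats.error} after ${stats.iterations} iterations`);
}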

RNN, LSTM, or GRU Output too short or too long

The net's maxPredictionLength property (default 100) can be set to adjust the length of the output:

Example:

const net = new brain.recurrent.LSTM();

// later in code, after training on a few novels, write me a new one!
net.maxPredictionLength = 1000000000; // Be careful!
net.run('Once upon a time');

JSON

Serialize or load in the state of a trained network with JSON:

const json = net.toJSON();
net.fromJSON(json);

Standalone Function

You can also get a custom standalone function from a trained network that acts just like run():

const run = net.toFunction();
const output = run({ r: 1, g: 0.4, b: 0 });
console.log(run.toString()); // copy and paste! no need to import brain.js

Options

NeuralNetwork() takes a hash of options:

const net = new brain.NeuralNetwork({
  activation: 'sigmoid', // activation function
  hiddenLayers: [4],
  learningRate: 0.6, // global learning rate, useful when training using streams
});

activation

This parameter lets you specify which activation function your neural network should use. There are currently four supported activation functions, sigmoid being the default:

Here's a table (thanks, Wikipedia!) summarizing a plethora of activation functions — Activation Function
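
For example, a minimal sketch of switching the activation function (leakyReluAlpha only applies to 'leaky-relu'):

const net = new brain.NeuralNetwork({
  hiddenLayers: [4],
  activation: 'leaky-relu', // 'sigmoid' (default), 'relu', 'leaky-relu', or 'tanh'
  leakyReluAlpha: 0.01,
});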

hiddenLayers

You can use this to specify the number of hidden layers in the network and the size of each layer. For example, if you want two hidden layers - the first with 3 nodes and the second with 4 nodes, you'd give:

hiddenLayers: [3, 4];

By default brain.js uses one hidden layer with a size proportional to the size of the input array.

Streams

Use https://www.npmjs.com/package/train-stream to stream data to a NeuralNetwork

Utilities

likely

const likely = require('brain/likely');
const key = likely(input, net);

Likely example see: simple letter detection

toSVG

<script src="../../src/utilities/svg.js"></script>

Renders the network topology of a feedforward network

document.getElementById('result').innerHTML = brain.utilities.toSVG(
  network,
  options
);

toSVG example see: network rendering

The user interface used: screenshot1

Neural Network Types

Why different Neural Network Types

Different neural nets do different things well. For example:

  • A Feedforward Neural Network can classify simple things very well, but it has no memory of previous actions and has infinite variation of results.
  • A Time Step Recurrent Neural Network remembers, and can predict future values.
  • A Recurrent Neural Network remembers, and has a finite set of results.
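
A minimal sketch contrasting the three (outputs shown are approximate and assume enough training):

// Feedforward: fixed-size input -> output, no memory of previous calls.
const feedForward = new brain.NeuralNetwork();
feedForward.train([
  { input: [0, 0], output: [0] },
  { input: [1, 1], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
]);
feedForward.run([0, 1]); // close to 1

// Time step recurrent: remembers the sequence and predicts what comes next.
const timeStep = new brain.recurrent.LSTMTimeStep();
timeStep.train([[1, 2, 3, 4]]);
timeStep.run([1, 2, 3]); // roughly 4

// Recurrent: maps sequences such as strings onto a finite set of results.
const recurrent = new brain.recurrent.LSTM();
recurrent.train([
  { input: 'I feel great about the world!', output: 'happy' },
  { input: 'The world is a terrible place!', output: 'sad' },
]);
recurrent.run('I feel great about the world!'); // 'happy'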

Get Involved

W3C machine learning standardization process

If you are a developer, or if you just care about what an ML API should look like, please take part: join the W3C community and share your opinions, or simply support the opinions you agree with.

Brain.js is a widely adopted open source machine learning library in the JavaScript world. There are several reasons for that, but the most notable is its simplicity of use without sacrificing performance. We would like the eventual W3C standard to stay just as simple to learn, simple to use, and performant. We think the current brain.js API is quite close to what we could expect such a standard to look like. Since supporting this doesn't require much effort and can still make a huge difference, feel free to join the W3C community group and support a brain.js-like API.

Get involved in the ongoing W3C machine learning standardization process here. You can also join our open discussion about standardization here.

Issues

If you have an issue, whether a bug or a feature you think would benefit your project, let us know and we will do our best.

Create issues here and follow the template.

brain.js.org

Source for brain.js.org is available at Brain.js.org Repository. Built using awesome vue.js & bulma. Contributions are always welcome.

Contributors

This project exists thanks to all the people who contribute. [Contribute].

Backers

Thank you to all our backers! 🙏 [Become a backer]

Sponsors

Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]

brain.js's People

Contributors

abhisheksoni27, adyyoung, cclauss, creativecactus, daffl, danielmazurkiewicz, dependabot[bot], dhairyagada, eliamartani, freddyc, geokontop, harthur, ionicabizau, jcmais, ken0x0a, leandrigues, loredanacirstea, luansilveirasouza, m-ahmadi, massaroni, mubaidr, mujadded, nabeelvalley, nelias, ofirgeller, pejrak, perkyguy, robertleeplummerjr, stonet2000, theunlocked


brain.js's Issues

npm install failing

I get errors when running npm install:

  1. esprima-six is missing
  2. canvas does not install

npm ERR! Linux 3.13.0-37-generic
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "install" "browserify"
npm ERR! node v5.10.1
npm ERR! npm v3.8.3
npm ERR! code E404

npm ERR! 404 Registry returned 404 for GET on https://registry.npmjs.org/esprima-six
npm ERR! 404
npm ERR! 404 'esprima-six' is not in the npm registry.
npm ERR! 404 You should bug the author to publish it (or use the name yourself!)
npm ERR! 404 It was specified as a dependency of 'derequire'
npm ERR! 404
npm ERR! 404 Note that you can also install from a
npm ERR! 404 tarball, folder, http url, or git url.

npm ERR! Please include the following file with any support request:
npm ERR! /var/www/brain.js/npm-debug.log

predicting memory requirements?

I'm fairly new to implementing neural networks and am using this library for the first time. I'm doing text processing and so far I'm up to 5.6 GB of RAM required on a simple bigram.

The way I'm structuring the training data is like this:
{ input: {[inputWord]: 1}, output: {[outputWord]: 1 } }

so if it were "cat sat" the training data would look like:
{ input: {cat: 1}, output: {sat: 1} }

I have two questions:

  1. what's a good way to predict memory requirements? Is there a back-of-napkin big-O?
  2. is there a better way to structure the data in this particular case?

Thanks!

Different activation functions

Hey,

First off, cool work :-)! As far as I can see, brain.js is currently using a sigmoid/logistic activation function. I think it would be quite beneficial to:

  1. Document that this one is used
  2. Create different ones

I could implement some of them, like tanh, which work in the range of (0, 1) that is currently used as output.

Need for opinions and suggestions from experienced developers

In the past I have used Brain.js and some other JavaScript-based machine learning libraries, but unfortunately I found they did not match my needs. That's why, working on my personal projects, I developed the idea of creating a different approach myself. So, sorry to be here to talk about another project, but I would really like to receive some opinions and suggestions from experienced people in this specific field who could help me set useful, shared expectations. By the way, thanks for the great support given to Brain.js so far. The library is called DN2A and is at https://github.com/dn2a/dn2a-javascript

Cross-validation tests failing

fails on multiple accounts:
> mocha test/cross-validation --timeout 10000

(node) child_process: options.customFds option is deprecated. Use options.stdio instead.

OCR cross-validation
Cross validating
1) recognize characters in different fonts

0 passing (563ms)
1 failing

  1. OCR cross-validation recognize characters in different fonts:
    TypeError: Cannot read property 'learningRate' of undefined
    at testPartition (/var/www/brain.js/lib/cross-validate.js:21:28)
    at /var/www/brain.js/lib/cross-validate.js:60:18
    at Array.map (native)
    at crossValidate (/var/www/brain.js/lib/cross-validate.js:55:28)
    at Context. (/var/www/brain.js/test/cross-validation/ocr.js:63:18)
    at callFn (/var/www/brain.js/node_modules/mocha/lib/runnable.js:250:21)
    at Test.Runnable.run (/var/www/brain.js/node_modules/mocha/lib/runnable.js:243:7)
    at Runner.runTest (/var/www/brain.js/node_modules/mocha/lib/runner.js:373:10)
    at /var/www/brain.js/node_modules/mocha/lib/runner.js:451:12
    at next (/var/www/brain.js/node_modules/mocha/lib/runner.js:298:14)
    at /var/www/brain.js/node_modules/mocha/lib/runner.js:308:7
    at next (/var/www/brain.js/node_modules/mocha/lib/runner.js:246:23)
    at Immediate._onImmediate (/var/www/brain.js/node_modules/mocha/lib/runner.js:275:5)
    at tryOnImmediate (timers.js:534:15)
    at processImmediate [as _immediateCallback]
  • when trainOpts were added, this is the output error:

Training iterations per second: 45.87900933820544
1) recognize characters in different fonts

0 passing (3s)
1 failing

  1. OCR cross-validation recognize characters in different fonts:
    ReferenceError: assert is not defined
    at Context. (/var/www/brain.js/test/cross-validation/ocr.js:84:5)
    at callFn (/var/www/brain.js/node_modules/mocha/lib/runnable.js:250:21)
    at Test.Runnable.run (/var/www/brain.js/node_modules/mocha/lib/runnable.js:243:7)
    at Runner.runTest (/var/www/brain.js/node_modules/mocha/lib/runner.js:373:10)
    at /var/www/brain.js/node_modules/mocha/lib/runner.js:451:12
    at next (/var/www/brain.js/node_modules/mocha/lib/runner.js:298:14)
    at /var/www/brain.js/node_modules/mocha/lib/runner.js:308:7
    at next (/var/www/brain.js/node_modules/mocha/lib/runner.js:246:23)
    at Immediate._onImmediate (/var/www/brain.js/node_modules/mocha/lib/runner.js:275:5)
    at tryOnImmediate (timers.js:534:15)
    at processImmediate [as _immediateCallback]

create a "game changer" of a project that uses brain.js

I see a lot of usage of neural nets to do things like play games, or detect cats, or make crazy cool images. I think those are valuable, but what about something that could save someone's life?

I propose a project where we feed in X-ray images and it can detect cancer.

Feedback welcome.

Option hiddenLayers. Question about readme

In readme.md:

By default brain.js uses one hidden layer with size proportionate to the size of the input array.

Does this mean the hidden layer size is equal to the input data dimension? For example (sorry for my English), if the input array has dimension 100, will the hidden layer contain 100 neurons?

But in the source of the train() function it says:
sizes.push(Math.max(3, Math.floor(inputSize / 2)));

Which is right?

And what do you recommend for choosing custom sizes of hidden layers?

RNN or so to predict mouse motion

Hi,

Not an issue per se.
It would be very nice to have a simple RNN or similar to predict mouse motion.
Did someone see or build such an example in a browser?
I have seen such an example in C (I don't have the reference now).

I'm not looking for the fastest or smartest example. Just one that works ;-)

Thanks.

Strings?

Hey
Would like to know if the input objects support text strings.

I did something like this

var brain = require('brain.js')
var net = new brain.NeuralNetwork();

net.train([
           {input: "my unit-tests failed.", output: "software"},
           {input: "tried the program, but it was buggy.", output: "software"},
           {input: "i need a new power supply.", output: "hardware"},
           {input: "the drive has a 2TB capacity.", output: "hardware"}
         ]);

It outputs { error: NaN, iterations: 1 }

One of the comments says that the RNN supports them, so is it only available for that model, or can other models also make use of strings?

Showcases

We could add a section with showcases to the readme to provide usage examples and inspiration.
Given the response of the community, I think there are probably plenty of better-written projects than mine.

I wrote a classifier module that is able to perform multilabel classification on new observations. It has to be trained with the target classes alone.
https://github.com/FranzSkuffka/flextractor/tree/master/lib/classifier

What are your thoughts? What did you build already?

Generate possible results from training

Is there a way to train the brain and generate possible results instead of letting the brain analyse a certain input?

For example:
"Give me 4 colors which match 'orange' with at least 80% accuracy".

As pseudo-code:

var net = new NeuralNetwork(); //Create neural network

net.train([{input: {r:1, g:0.65, b:0},  output: {orange: 1}}, //This is orange
           {input: {r:0, g:0.54, b:0},  output: {green: 1}}, //This is green
           {input: {r:0.6, g:1, b:0.5}, output: {green: 1}}, //This is also green
           {input: {r:0.67, g:0, b:1},  output: {purple: 1}}]); //This is purple

var output = net.run({"orange": ">0.8", "results": 4}); //return 4 colors which match 'orange' with at least 80% accuracy

Pseudo-output:

[{r:1,    g:0.65, b:0},
 {r:0.98, g:0.55, b:0},
 {r:1,    g:0.55, b:0.2},
 {r:0.85, g:0.55, b:0}]

A real world example would be:
"Tell me the first thing which comes to your mind when I say Internet". You might say something like Tim Berners-Lee or Wikipedia.

Is something like that possible with BrainJS?
Note: I also posted this question on StackOverflow.

es6 it!

I'd like to update the syntax to that of es6, thoughts?

NPM version?

NPM has a different version? 0.7.0

New release?

Time series

Any chance of extending this service to support time series predictions?

How to run the train method multiple times for one net?

var net = new brain.default.NeuralNetwork();

var iter = 0;
walk.walkSync('./data', function(basedir, filename, stat) {
	var trainingData = require('./data/'+filename);
	
	var trainResult = net.train(
		trainingData, {
			initialization: (iter === 0),
			keepNetworkIntact: (iter > 0)
		}
	);
	
	iter++;
});

I expected that this would train the network on the different files of training data, but the resulting network behaved as if it had only been trained on the first file, as if the later training runs were not saved, despite using CPU and time.

What do I need to do to make this work?

P.S. I walk the filesystem because the training data is very large (2 GB).

Flexible store

Currently this works in memory. Would it be a good idea to extend this to work with any store (e.g. a database)?

That means anyone can build a custom store object and expose the methods to get and set data.

Pass current net to callback function

neural-network.js#L171

if (callback && (i % callbackPeriod == 0)) {
    callback({ error: error, iterations: i });
}

Is it possible to send the current net state to the callback function?

Let me explain.
I divide my training data into real training data and test data,
and I want to run the network on the test data periodically to see the real error.

Code for example what I want:

if (callback && (i % callbackPeriod == 0)) {
    callback({ error: error, iterations: i }, this);
}

Likely may need to go into a utility lib

This is an issue raised by @nickpoorman.

I've been trying to think of a concept where I can make the likely method not attached to the neural network. Currently it is:

var brain  = require('brain')
  , net    = new brain.NeuralNetwork
  ;
net.train([]);

var result = brain.likely([]);

Are you suggesting something like:

var brain  = require('brain')
  , likely = require('lib/likely')
  , net    = new brain.NeuralNetwork
  ;
net.train([]);

var result = likely(net, []);

?

Stream-example write(null) stream error node v6.9.2

I recently cloned the repository and went to run the stream example and got an error that the stream write could not receive a null value. I suspect it's because streams have changed in the last few Node iterations.

Any suggestions on a fix? It seems that taking out the write(null) from stream-example fixes the error, but then it doesn't fire the statistics of the stream iterations or provide a toJSON output.

Will work out more details when at a computer.

brainstorm how to get objects into brain.js recurrent neural network

The standard network uses:

[
  {input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 }},
  {input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 }},
  {input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 }}
]

for training. Would it be possible to get something like that into the recurrent neural network? If so, how?

get browser version running

Would also be nice while doing this to consider where we absolutely need third-party plugins. They are used occasionally, but nothing seems too complex for native JS to handle.

How to normalize discrete data input

Hi!
As I read in readme.md, I need to normalize the input data to the range 0..1.
That's fine for linear data, such as color, price, year, etc.

But look at my example:
The form on my web page has one field with radio buttons, like:

  • credit;
  • debet;
  • personal card;
  • other.

How should I provide this data to train the network?
OK, I can send something like:

  • type__credit: 0;
  • type__debet: 0;
  • type__personal_card: 1;
  • type__other: 0.

But this creates more input columns, especially with many fields/values, as I showed.
Then I thought, what would happen if I did this:

  • for «credit»: type=0.25;
  • for «debet»: type=0.5;
  • for «personal card»: type=0.75;
  • for «other»: type=1.

That saves me from many columns, but will it work?

And what can I do for checkboxes? Maybe:

var typeForInputBrain = 0;

typeForInputBrain += (typeFromForm.indexOf('credit')) ? 0.04 : 0;
typeForInputBrain += (typeFromForm.indexOf('debet')) ? 0.08 : 0;
typeForInputBrain += (typeFromForm.indexOf('personal card')) ? 0.16 : 0;
typeForInputBrain += (typeFromForm.indexOf('other')) ? 0.32 : 0;

Then even if all values are checked, the input value stays in the range between 0 and 1.
OK? Is this a working method, or does this approach require a new activation function?

P.S. Sorry for my english :)

Accuracy

var net = new brain.NeuralNetwork();

var trainingSet = [
  {
    input: [0.1, 0.1],
    output: [0.2]
  },
  {
    input: [0.2, 0.2],
    output: [0.4]
  },
  {
    input: [0.5, 0.1],
    output: [0.6]
  },
  {
    input: [0, 0.1],
    output: [0.1]
  },
  {
    input: [0.3, 0.1],
    output: [0.4]
  }
];

console.log(net.train(trainingSet, { // Object {error: 0.000099992701244754, iterations: 15197}
  errorThresh: 0.0001,
  learningRate: 0.3
}));

console.log(net.run([0.2, 0.4])); // [0.5480908025594973]
console.log(net.run([0.1, 0.4])); // [0.45124782633077687]
console.log(net.run([0.3, 0.4])); // [0.6239448854007088]
console.log(net.run([0.5, 0.4])); // [0.721436435561435]
console.log(net.run([0, 0.4])); // [0.3393225809755128]

I've tried a few different approaches and all sorts of input/outputs (and another library actually).

I really can't get accurate data out. Any tips?

Consider merging in all the PRs submitted to harthur's original brainjs

First of all, so excited to see someone doing this! Having maintained several open source projects myself, I completely understand why harthur would decide to move on from spending her time maintaining these. But her projects are so useful that they shouldn't be left to languish. Very happy to see the community picking up the torch!

One suggestion I have (if you haven't done it already) is to merge in all the PRs that were submitted to harthur's original brain repo. It should be pretty doable with just a little bit of git fu. They wouldn't show up as PRs that were merged in, just external branches that were added to master, but I think that's ok.

Or another alternative, you could also just comment on all the old PRs and ask the original authors to resubmit them to the harthur-org repo.

Either way, thanks for picking up the torch here! Glad to see actively supported machine learning in JavaScript :)
