
neataptic's People

Contributors

agamm, alaa-eddine, alexistm, anubisthejackle, bcbcb, bobalazek, bondifrench, cazala, coleww, cristy94, d-nice, eugenioclrc, everyonedoteu, filet-mign0n, germ13, jabher, jakeprasad, jmussett, jzjzjzj, lucasbertola, menduz, mkondel, oritwoen, qard, rwieruch, sleepwalking, vkaracic, wagenaartje, yogiben, zonetti


neataptic's Issues

Methods.Mutation.SUB_GATE causes error

Causes the following error every now and then:

Uncaught TypeError: Cannot read property 'ungate' of null

This code will create that error sometimes:

var x = new Architect.Perceptron(3, 10, 3);
for (var i = 0; i < 10000; i++) {
  var keys = Object.keys(Methods.Mutation);
  x.mutate(Methods.Mutation[keys[Math.floor(keys.length * Math.random())]]);
}
x.activate([1, 0, 1]);

Will be investigated!
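In the meantime, a minimal workaround sketch (same names as the repro above): wrap each random mutation in a try/catch so a failing SUB_GATE doesn't abort the whole loop.

var x = new Architect.Perceptron(3, 10, 3);
for (var i = 0; i < 10000; i++) {
  var keys = Object.keys(Methods.Mutation);
  var method = Methods.Mutation[keys[Math.floor(keys.length * Math.random())]];
  try {
    x.mutate(method);          // SUB_GATE occasionally throws here
  } catch (e) {
    console.warn(method.name + ' failed: ' + e.message); // skip and continue
  }
}
x.activate([1, 0, 1]);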

UnhandledPromiseRejectionWarning: Unhandled promise rejection

Running version 1.3.0, the XOR evolve example has the error:

(node:13051) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: Cannot read property 'FFW' of undefined
(node:13051) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

It seems that reading the Methods.mutation.FFW property fails. Has the property been moved or deleted?
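In the meantime, since evolve() is an async function and returns a Promise, attaching a rejection handler surfaces the full stack instead of the deprecation warning (network, trainingSet and options here are placeholders for the XOR example's own names):

// Handle the rejection explicitly to get a full stack trace
network.evolve(trainingSet, options).catch(function (err) {
  console.error(err);
});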

[dev] GRU network

So I'm trying to implement a GRU network. I have tried this multiple times in the past, but this time I'm giving it a serious attempt. This is my current implementation of a Gated Recurrent Unit with size 1:

/** Inverse activation function for the inverse update gate */
Methods.Activation.INVERSE = function (x, derivate) {
  if (derivate) return -1;
  return 1 - x;
};

var SIZE = 1;

var input = new Group(SIZE);
var previousInput = new Group(SIZE);
var updateGate = new Group(SIZE);
var inverseUpdateGate = new Group(SIZE);
var resetGate = new Group(SIZE);
var memoryCell = new Group(SIZE);
var output = new Group(SIZE);

previousInput.set({ bias: 0, squash: Methods.Activation.IDENTITY, type: 'constant'});
memoryCell.set({ squash: Methods.Activation.TANH });
inverseUpdateGate.set({ bias: 0, squash: Methods.Activation.INVERSE, type: 'constant'});
updateGate.set({ bias: 1 });
resetGate.set({ bias: 1 });

// Previous input calculation
input.connect(previousInput, Methods.Connection.ONE_TO_ONE, 1);

// Update gate calculation
input.connect(updateGate);
previousInput.connect(updateGate);

// Inverse update gate calculation
updateGate.connect(inverseUpdateGate, Methods.Connection.ONE_TO_ONE, 1);

// Reset gate calculation
input.connect(resetGate);
previousInput.connect(resetGate);

// Memory calculation
input.connect(memoryCell);
var reset = previousInput.connect(memoryCell);

resetGate.gate(reset, Methods.Gating.OUTPUT); // gate

// Output calculation
var update1 = previousInput.connect(output);
var update2 = memoryCell.connect(output);

updateGate.gate(update1, Methods.Gating.OUTPUT);
inverseUpdateGate.gate(update2, Methods.Gating.OUTPUT);

var nodes = [input, updateGate, inverseUpdateGate, resetGate, memoryCell, previousInput, output];
var network = Architect.Construct(nodes);

However, when training on data, it never seems to converge:

var trainingSet = [
  { input: [0], output: [1.0] },
  { input: [0], output: [0.9] },
  { input: [0], output: [0.7] },
  { input: [0], output: [0.4] },
  { input: [0], output: [0.0] }
];


// Train a sequence: 00100100..
network.train(trainingSet, {
  log: 1000,
  rate: 0.3,
  clear: true,
  iterations: 50000
});

The above example will never converge.
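A quick way to see the lack of convergence is to replay the training sequence after training (a small sketch using the names from the code above; network.clear() resets the recurrent state first):

// Replay the sequence and compare expected vs. actual outputs
network.clear();
for (var i = 0; i < trainingSet.length; i++) {
  var actual = network.activate(trainingSet[i].input);
  console.log('expected', trainingSet[i].output, 'actual', actual);
}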

So I need some help. I've set up a JSFiddle here for you to run. For reference, this is the architecture of a Gated Recurrent Unit:

Please keep in mind that (1-zt) and (zt) can be switched freely ( they are just inverses of each other ).

Add LSTM options info to wiki

I would have done a pull request, but unfortunately you can't make pull requests to the wiki. This is for the wiki under Architects, LSTM.

// more wiki...
You can also stack multiple layers of memory blocks:

var myLSTM = new Architect.LSTM(2,4,4,4,1);

You can pass options if desired like so:

var options = {
  memoryToMemory: false,      // default is false
  outputToMemory: false,      // default is false
  outputToGates: false,       // default is false
  inputToOutput: true         // default is true
};
var myLSTM = new Architect.LSTM(2,4,4,4,1,options);

// more wiki...

Cannot resolve 'child_process'

./~/neataptic/src/multithreading/workers/node/testworker.js
Module not found: Error: Can't resolve 'child_process' in '[...]/node_modules/neataptic/src/multithreading/workers/node'
 @ ./~/neataptic/src/multithreading/workers/node/testworker.js 7:9-33
 @ ./~/neataptic/src/multithreading/workers/workers.js
 @ ./~/neataptic/src/multithreading/multi.js
 @ ./~/neataptic/src/neataptic.js
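This happens when the package is bundled for the browser (e.g. with webpack), where Node built-ins such as child_process don't exist. A common workaround sketch, assuming webpack 2-4, is to stub the built-ins out in the bundler config:

// webpack.config.js -- replace Node-only built-ins with empty stubs
module.exports = {
  // ...existing entry/output/module settings...
  node: {
    child_process: 'empty',
    fs: 'empty'
  }
};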

Node require

I am sorry, but I can't get Neataptic running with Node.

Console:

npm install neataptic

app.js:

var neataptic = require('neataptic');

Error

c:\neat\node_modules\neataptic\src\architecture\network.js:831
  evolve: async function (set, options) {
                ^^^^^^^^
SyntaxError: Unexpected token function
    at createScript (vm.js:56:10)
    at Object.runInThisContext (vm.js:97:10)
    at Module._compile (module.js:542:28)
    at Object.Module._extensions..js (module.js:579:10)
    at Module.load (module.js:487:32)
    at tryModuleLoad (module.js:446:12)
    at Function.Module._load (module.js:438:3)
    at Module.require (module.js:497:17)
    at require (internal/module.js:20:19)
    at Object.<anonymous> (c:\neat\node_modules\neataptic\src\architecture\architect.js:3:15)

network.js:831

  evolve: async function (set, options) {
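For reference, the async function syntax used at network.js:831 is only parsed by Node 7.6 and newer; older runtimes fail with exactly this SyntaxError. A quick check of the runtime:

// async/await landed in Node 7.6; anything older throws
// "SyntaxError: Unexpected token function" when parsing neataptic
console.log('Running on Node', process.version);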

'Keras-like' model creation

Recently, I brought up an idea in this comment: #12 (comment). I think the way custom networks can be created still requires some knowledge about network architectures, which may be too complicated for some. I want to revamp model creation, making it more layered like synaptic, but with extra flexibility.

My current idea of network creation:

// Create a network
var model = new Model();
model.add(Dense(10)); // 10 inputs
model.add(LSTM(5, { activation: Activation.RELU }));
model.add(NARX(10));
model.add(Dense(3, { activation: Activation.TANH})); // 3 outputs

// To be discussed, adding custom (recurrent) connections
model.layers[0].connect(model.layers[2]);
model.layers[3].connect(model.layers[1]);

var network = model.compile();
network.train(...

Some basic points:

  • A model should always start and end with a Dense() layer.
  • Optional arguments can be specified in a dictionary, e.g. activation, bias, weight initialisation, or the connection type to the next layer (e.g. ALL_TO_ALL, ONE_TO_ONE)
  • After setting up the layers, the network must be compiled. This turns the network into an array of nodes, making it more amenable to genetic algorithms and optimisation
  • The training function will remain the same as now

Things to discuss:

  • What is a good way to allow custom connections? Check out the code above about what my first idea is on how to do this

Layer types I want to embed:

  • LSTM
  • NARX
  • Dense
  • Random
  • GRU
  • Clock
  • Softmax
  • Convolution
  • Pooling

Crash "Cannot read property 'connect' of undefined"

On the latest version (1.2.21) I got a crash. This was in the console:

New generation.
Generation: 3560 - average score: -725.144
Fittest score: -258.686, Best genome, nodes length: 8, nodes connections: 13
\node_modules\neataptic\src\network.js:107
    var connections = from.connect(to, weight);
                          ^
TypeError: Cannot read property 'connect' of undefined
    at Network.connect (\node_modules\neataptic\src\network.js:107:27)
    at Function.Network.crossOver (\node_modules\neataptic\src\network.js:1059:29)
    at Neat.getOffspring (\node_modules\neataptic\src\neat.js:107:21)
    at endEvaluation (\app.js:121:33)
    at learnDatra (\app.js:243:9)
    at Parser.fs.createReadStream.pipe.on.on (\app.js:186:13)
    at emitNone (events.js:110:20)
    at Parser.emit (events.js:207:7)
    at endReadableNT (_stream_readable.js:1045:12)
    at _combinedTickCallback (internal/process/next_tick.js:102:11)
Waiting for the debugger to disconnect...

Error: Dataset input/output size should be same as network input/output size!

Apologies if I'm making a silly mistake somewhere.

I'm using the following network generation settings for a NARX network:

var settings = {
    inputSize: 4,
    hiddenLayers: 5,
    outputSizes: 1,
    previousInput: 3,
    previousOutput: 3
};

With the following test data:

  { input: [ 0.0199, 0.01956, 0.01976, 0.01963 ], output: [ 1 ] },
// About 400 in length

Any help/direction would be appreciated

Network.fromJSON sending a null Node

Managed to capture this issue when running my NEAT setup with all the mutations enabled:

(screenshots: capture, capture2)

In the second image, one of the arrays has an index of -1, which causes it to send a null node that then cannot be gated (shown in the first image).

Is this the issue with crossover that makes networks "infertile"? If so, could you provide an example on your wiki of how to use the new functionality with the LSTM? Really eager to try the LSTM.

LSTM example not working

The link in the LSTM docs produces no output, with the following errors in the console:

neataptic.js Failed to load resource: the server responded with a status of 404 (Not Found)
fiddle.jshell.net/wagenaartje/k23zbf0f/1/show/:54 Uncaught ReferenceError: neataptic is not defined

Improve performance

@wagenaartje I am opening this issue as a discussion regarding performance optimization. In Endless repeat 'no more nodes left to remove!', you had said:

(PS: i'm working on speeding up evolution drastically using webworkers)

If you recall, I emailed you earlier regarding my efforts to widely distribute the evolution process, so it may be a good idea to collaborate.

I've been experimenting with Azure Functions and AWS Lambda, which both allow you to execute JavaScript code. An implementation I am experimenting with would turn the fitness function into a Promise, and that promise can be resolved by any means, including web workers, child processes, or API calls.

The problem is, once one thing is a promise, everything must be a promise. Currently, evolve() and all its dependents are synchronous, so implementing this change will drastically affect the interface. Despite that, I strongly believe Promises are the best practice for this scenario: it's a widely accepted pattern that works seamlessly with libraries like Bluebird and RxJS, leaving the actual implementation up to the user.
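To make the idea concrete, a minimal sketch (hypothetical API; fitness, population and evaluateSomehow are placeholder names): the fitness function returns a Promise, and a generation is evaluated with Promise.all before selection.

// The resolver can be backed by a web worker, a child process, or an
// API call (Azure Functions / AWS Lambda) -- the evolve loop doesn't care.
function fitness(genome) {
  return new Promise(function (resolve) {
    resolve(evaluateSomehow(genome)); // placeholder for any backend
  });
}

// Evaluate the whole population concurrently, then select/breed as usual
Promise.all(population.map(fitness)).then(function (scores) {
  scores.forEach(function (score, i) { population[i].score = score; });
});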

NEAT wiki page needs an update

I'm trying to get my head around doing a more advanced example in NEAT.

I have been looking at the source code of the agario-ai example. It uses the method neat.sort(), which the wiki page says "should not be called manually". It also uses neat.mutate(), neat.getOffspring() and neat.population, which are not listed on the wiki page.

For advanced examples, should we be going down the agario-ai path, or should we be using the fitness function?
https://github.com/wagenaartje/neataptic/wiki/NEAT

Multithreading issue

For some reason, results from testing a dataset on a network with a worker might differ from just using Network.prototype.test(). Working on the issue.

For now, using multiple threads is discouraged.

No group connection specified, using ALL_TO_ALL

The following code:

const neataptic = require('neataptic');

const Architect = neataptic.Architect;

const lstmOptions = {
    hiddenToHidden: false,
    outputToHidden: false,
    outputToGates: false,
    inputToOutput: true,
};
const lstm = new Architect.LSTM(1, 4, 4, 4, 1, lstmOptions);

const trainSet = [
    { input: [0], output: [0.1] },
    { input: [1], output: [0.2] },
    { input: [0], output: [0.3] },
    { input: [1], output: [0.4] },
    { input: [0], output: [0.5] },
];
const trainOptions = {
    rate: 0.2,
    iterations: 10000,
    error: 0.005,
    cost: null,
    crossValidate: null,
};
const trainResults = lstm.train(trainSet, trainOptions);
console.log(trainResults);

const testResults = [];
testResults[0] = lstm.activate([0]);
testResults[1] = lstm.activate([1]);
testResults[2] = lstm.activate([0]);
testResults[3] = lstm.activate([1]);
testResults[4] = lstm.activate([0]);
console.log(testResults);

Produces the following output:

No group connection specified, using ALL_TO_ALL (×7)
No group connection specified, using ONE_TO_ONE
No group connection specified, using ALL_TO_ALL (×8)
No group connection specified, using ONE_TO_ONE
No group connection specified, using ALL_TO_ALL (×12)
No group connection specified, using ONE_TO_ONE
No group connection specified, using ALL_TO_ALL (×6)
{ error: 0.004988136102436419, iterations: 1610, time: 966 }
[ [ 0.18006586813613865 ],
  [ 0.28990322150410275 ],
  [ 0.3295774971285735 ],
  [ 0.37555477688446837 ],
  [ 0.4440169668335829 ] ]

I'm guessing this has to do with the initial configuration. I also understand that the connection type is specified when creating connections (https://github.com/wagenaartje/neataptic/wiki/Connection). Perhaps the warning should not be displayed when the user intends to use the default.
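As a stopgap, some builds of the library gate these messages behind a config flag; whether your installed version exposes it is an assumption to verify against its source:

// Assumption: the installed build exports a config object with a
// warnings switch; guard the access so nothing breaks if it doesn't.
const neataptic = require('neataptic');
if (neataptic.config) neataptic.config.warnings = false;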

Error: Cannot find module './neataptic' in worker.js

Not sure if I'm requiring the neataptic module correctly.

I fixed this by changing the line in neataptic/src/multithreading/workers/node/worker.js from:

var { multi, methods } = require('./neataptic');

to

var { multi, methods } = require('../../../neataptic');

Broken links in readme.

The examples in the readme have broken links: an extra ')' character was appended to each URL.

Agar.io-AI (supervised)
Target seeking AI (unsupervised)
Crossover playground

Modify weights of selfconnects

At this moment, the MOD_WEIGHT mutation looks like this:

var connection = this.connections[Math.floor(Math.random() * this.connections.length)];
var modification = Math.random() * (method.max - method.min) + method.min;
connection.weight += modification;

Self-connections should also get a chance of weight mutation.
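A minimal sketch of the change, assuming the network also keeps self-connections in a this.selfconns array alongside this.connections:

// Pool regular and self-connections so both can receive weight mutations
var pool = this.connections.concat(this.selfconns);
var connection = pool[Math.floor(Math.random() * pool.length)];
var modification = Math.random() * (method.max - method.min) + method.min;
connection.weight += modification;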

Outputs are out of the [0, 1] range

Hey again,

I was testing the neuroevolution playground and got some outputs out of bounds (outside [0, 1]).

Output of the neuroevolution example 2:

Input: [0], wanted output: [0], actual: [-0.026]
Input: [0], wanted output: [0], actual: [-0.02]
Input: [0], wanted output: [0], actual: [-0.013]
Input: [0], wanted output: [0], actual: [-0.006]
Input: [0], wanted output: [0], actual: [-0.001]
Input: [0], wanted output: [0], actual: [0.002]
Input: [0], wanted output: [0], actual: [0.004]
Input: [0], wanted output: [0], actual: [0.004]
Input: [0], wanted output: [0], actual: [0.014]
Input: [0], wanted output: [1], actual: [0.979]

Average generation score greater than fittest

Haven't had this occur until now; it happened with version 1.2.6.

Generation: 0 - average score: 17419624
Fittest score: 4354906

For reference as to what I'm doing here: I've imported an existing fittest genome from a JSON file and provided it as the basis for the NEAT population. The subsequent generations are fine, with the fittest above the average, albeit with scores an order of magnitude lower.

Inconsistency check failed

In the network propagation function, the "undefined target" inconsistency is not properly handled.

https://github.com/wagenaartje/neataptic/blob/master/src/architecture/network.js#L78-L80

This:

    if (typeof target !== 'undefined' && target.length !== this.output) {
      throw new Error('Output target length should match network output length');
    }

Should be:

    if (typeof target == 'undefined' || target.length !== this.output) {
      throw new Error('Output target length should match network output length');
    }

Can't train MNIST digits to Perceptron

Hello, I'm having trouble teaching the MNIST dataset to a Perceptron. Here is a fiddle with the code: https://jsfiddle.net/cazala/pn5zerq9/

I wanted to compare the performance to Synaptic's, since the readme claims a 10x speedup, and as the author of that library I'm very intrigued to see how you achieved it, but I couldn't make it train at all on a real dataset.

Here's that same piece of code using Synaptic so you get an idea of what I was trying to do: https://jsfiddle.net/cazala/672c4j1g/

I went over the docs but I couldn't find what I am doing wrong...

Oh, by the way, there seems to be a bug in schedule.function: the docs say it receives a data object, but I was getting undefined. That's why line 15 is commented out.

Can't require() the dist version

I have been setting up mocha tests lately, and of course I want to run the tests against the distributed version. However, when doing require('../dist/neataptic.js'); an empty object is returned. It works fine when running the tests against require('../src/neataptic.js');. Could someone lend me a helping hand?

High amount of mutation causes error

At this moment I'm investigating what is going on. But when a high number of mutations is applied to a single genome, e.g. 10k mutations, some of the outputs may consistently be NaN. I'm fairly sure this is linked to one of the 6 new mutation methods.

Mutations and MOD_ACTIVATION bug?

Hi,

I'm new to neural networks and neataptic, so I assume that I'm misunderstanding something.

I'm trying to evolve a network with the following parameter for mutation:
neataptic.Methods.Mutation.FFW

However, if I print out the mutation object, I get nulls in the allowed property of MOD_ACTIVATION:

"mutation": [
   {
     "name": "ADD_NODE"
   },
   {
     "name": "SUB_NODE",
     "keep_gates": true
   },
   {
     "name": "ADD_CONN"
   },
   {
     "name": "REMOVE_CONN"
   },
   {
     "name": "MOD_WEIGHT",
     "min": -1,
     "max": 1
   },
   {
     "name": "MOD_BIAS",
     "min": -1,
     "max": 1
   },
   {
     "name": "MOD_ACTIVATION",
     "mutateOutput": true,
     "allowed": [
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null,
       null
     ]
   },
   {
     "name": "SWAP_NODES",
     "mutateOutput": true
   }
 ]

Is this expected? Am I doing something wrong?

I'm using the latest version of neataptic (as of today) v1.2.34

Thanks for any help.

NEAT wiki mistakes

Noticed two issues on the NEAT doc page (https://wagenaartje.github.io/neataptic/docs/neat/)

"selection
Sets the allowed selection methods used in the evolutionary process. Must be an array. Must be an array (e.g. [Selection.FITNESS_PROPORTIONATE]). Default is FITNESS_PROPORTIONATE."

It's not an array (it crashes if you try).

"equal
If you want to start the algorithm from a specific network, specify your network here."

This description is for the "network" option.

GPU Support with gpu.js?

As the title already says, this is more of a feature request (in the post you closed, you did say to open a new issue for feature requests).

If I had to choose one JavaScript GPU solution, I would personally go for:
http://gpu.rocks/

Since it even works with Intel Iris cards, it looks promising. For example, my laptop takes 0.415 seconds for the matrix benchmark on CPU only, and with the GPU it's down to 0.089 on an Intel Iris. If it works on Iris, all other GPUs should work too, and the browser vendors keep improving it.

just my 2 cents :-)

Endless repeat 'no more nodes left to remove!'

Hello. I used the evolve method to test the MNIST task. I referenced the tutorials in the evolution section and the article at https://blog.webkid.io/neural-networks-in-javascript/.

var mnist = require('mnist');
var neataptic = require('neataptic');

var set = mnist.set(700,20);
var trainset = set.training;
var testset = set.test;

var Network = neataptic.Network;
var evolve = neataptic.evolve;

var mynetwork = new Network(784,10);
mynetwork.evolve(trainset,{});

The results were not good. I waited 20 hours for the training to complete, but only the message 'no more nodes left to remove!' repeated infinitely:

No more nodes left to remove!
No more connections to be made!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
...

So this time I tried the train method instead of evolve. The training then completed in just one minute, and the test results were also good.

I can't tell what's wrong. The same code worked well in the XOR example in the evolution tutorial. The only differences are that the amount of data and the number of nodes are slightly larger, and even that worked fine with the train method.

(Oh, while I was writing this question, the result of the second attempt came out:

"CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory." 

He is dead. lol)


Obviously there is something I've missed about the evolve method, but I don't know what I don't know.

activate outputs NaN/Infinity

I'm inputting 5 values to the Neat activate function and randomly getting NaN/Infinity on the latest version (1.1.7).
My inputs are valid numbers in [0, 1], e.g.:

[0, 1E-08, 0.01336913588, 4E-08, 1.8E-07]

neat = new Neat(5, 1, null, {
  mutation: [
    Methods.Mutation.ADD_NODE,
    Methods.Mutation.SUB_NODE,
    Methods.Mutation.ADD_CONN,
    Methods.Mutation.SUB_CONN,
    Methods.Mutation.MOD_WEIGHT,
    Methods.Mutation.MOD_BIAS,
    Methods.Mutation.MOD_ACTIVATION,
    Methods.Mutation.ADD_GATE,
    Methods.Mutation.SUB_GATE,
    Methods.Mutation.ADD_SELF_CONN,
    Methods.Mutation.SUB_SELF_CONN,
    Methods.Mutation.ADD_BACK_CONN,
    Methods.Mutation.SUB_BACK_CONN
  ],
  mutationRate: 0.4,
  popsize: 500,
  elitism: 50
});

....

var output = this.brain.activate(input);

Support for Minimal Fitness Function

I'm trying to implement the sum of absolute differences as my fitness function, so the minimal returned value is the "best" fit. It seems this library doesn't support these well-known distance measures in the fitness function by default; I'm having to do something like this to make it perform as expected:

return 10 - sum;

which seems hacky. Additionally, if I change the 10 to Number.MAX_VALUE, I no longer seem to get good results (which I assume is due to some internal error threshold or similar).

Is there a good reason why the NEAT fitness function doesn't default to minimization, which would support all the well-known distance measures out of the box?

I noticed that this sort method once used to be reversed, in a way I would consider expected: 69b44b8#diff-447d2789045ed8b6b82edba48b3301b6L65

@wagenaartje
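A common workaround sketch is to maximize the negated distance rather than offsetting it by a constant; note the caveat that fitness-proportionate selection may still expect scores shifted into the positive range:

// Minimizing sum-of-absolute-differences equals maximizing its negation;
// relative ordering is all that rank/tournament-style selection needs.
genome.score = -sum;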

A possibility to control the population's complexity in NEAT evolution

Hi,
I've been looking for a JS NEAT library for a while, and I just stumbled upon neataptic. It's amazing :)

I have a suggestion:
It would be interesting to have a way to control network complexity in NEAT evolution, as sometimes we want to find a solution that doesn't have more than a certain number of nodes/connections/gates.

I see two possibilities:

1. Add three new options to the Neat class (maxNodes, maxConns and maxGates) and an optional parameter to Network.prototype.mutate for passing the Neat object, so the code can access the Neat options and exit ADD_NODE, ADD_CONN and ADD_GATE if they would exceed the limit.

2. Instead of selecting a random mutation method in Neat.prototype.mutate:

var mutationMethod = this.mutation[Math.floor(Math.random() * this.mutation.length)];

use a custom function to select the mutation method, which is random by default (sketched below).
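A sketch of suggestion 2 (a hypothetical hook, not an existing option), assuming Neat would call it with the candidate genome and its mutation list:

// Hypothetical user-supplied selector that enforces a node cap
function selectMutationMethod(genome, mutations) {
  var allowed = mutations.filter(function (m) {
    // filter out growth mutations once the genome reaches the cap
    return !(m === Methods.Mutation.ADD_NODE && genome.nodes.length >= 20);
  });
  return allowed[Math.floor(Math.random() * allowed.length)];
}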

If you think one of the modifications (or both) is interesting, I can propose a PR for either or both.

Network crossover bug

Seems today I am getting lucky (or unlucky): I keep running into these odd behaviours.

(screenshots: capture, capture2)

This might be because I am running the new mutations on a Neat object.

new Neat() - cannot take the new mutation ADD_GATE?

Wanted to give the new mutations a test run. Not sure if this is only allowed on the LSTM architecture, but when I run with the Neat class I always seem to get stuck on this line:

this.connections.gated.push(connection);

saying that gated is undefined.

Do I need to create an LSTM network for this mutation to work?

Regards,
Tim

Large values causing `NaN` w/ some activation functions

During the creation of some examples, like the Agar.io AI and the target-seeking AI, I noticed that some genomes suddenly turned invisible. I investigated for some time and found that some networks would start to output NaN.

Today, deeper investigation revealed that there is a problem with the following activation functions:

  • Activation.RELU
  • Activation.IDENTITY
  • Activation.SOFTPLUS
  • Activation.BENT_IDENTITY
  • Activation.ABSOLUTE

They all have one thing in common: either their lower bound is -Infinity or their upper bound is Infinity, i.e. they can produce arbitrarily large outputs.

One of the issues this causes is with Activation.SOFTPLUS: any input around 1000 will produce Infinity, and the following functions output NaN when receiving ±Infinity as input:

  • Activation.SOFTSIGN
  • Activation.SINUSOID
  • Activation.BENT_IDENTITY

I'm also noticing that the derivative of COMPLEMENTARY_LOG_LOG causes issues: an input of 1000 produces Infinity as the gradient, which messes up node biases.
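A minimal reproduction of the overflow chain in plain JS:

// softplus(x) = ln(1 + e^x): Math.exp(1000) overflows to Infinity
var softplus = function (x) { return Math.log(1 + Math.exp(x)); };
console.log(softplus(1000)); // Infinity

// feeding that Infinity into softsign then yields NaN
var softsign = function (x) { return x / (1 + Math.abs(x)); };
console.log(softsign(softplus(1000))); // NaN (Infinity / Infinity)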


There are two ways I could solve this at the moment: either remove the activation functions that produce ±Infinity, or remove the activation functions that don't accept ±Infinity. Neither seems like a real option, so I'm open to any advice.

Hopfield different activation

Is it possible to create a Hopfield network with an activation function other than STEP?
Also, do you have any ideas on how to implement a self-organizing map?

More Articles / Tutorials

I'm a relative newbie to neural networks and have been playing with a few different JS libraries for a couple of weeks, since they seem to have so much potential.

I recently made a self-driving car that seeks targets on the screen using synaptic.js. I trained it by having it take 3 directional movements (go forward, turn left, turn right) and then calculating the distance to the target. Whichever of the 3 actions got the closest, I'd use synaptic's .propagate() method to teach it that "correct" movement.

This seems to work OK, and after tens of thousands of iterations the cars are successfully trained to seek targets.

I was curious, however, whether I could upgrade the car's ability by using Neataptic to evolve a better car, without telling it what I think the correct action is at every step. I'd like to just have Neataptic execute one of the 3 actions at random, then give the network a score/reward based on the distance to the target, so the network can figure out which actions to take and evolve better cars.

I repeatedly went over your XOR example, but it seems you're actually teaching the network by giving it the correct answer (Methods.Cost.MSE([0], genome.activate([0, 0]))), which seems closer to what I'm currently doing with Synaptic.js. Is there a way to teach these things via a positive/negative reward system instead of feeding in "correct" answers?

Is this kind of learning possible with this library? Could you give me some pointers on how to do this kind of training with Neataptic, or perhaps add some tutorials/articles that explain it?

Thanks and sorry this is so long.

should SWAP_NODES consider the mutateOutput option?

After turning off the mutations on the output with:

MOD_ACTIVATION.mutateOutput = false;

should we also protect the output nodes when a SWAP_NODES mutation is attempted?

After updating to commit 7e75cd9, I found myself with broken outputs, so I disabled SWAP_NODES to continue.

Perhaps add a SWAP_NODES.mutateOutput option? A sketch follows.
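A minimal sketch of the proposed guard, assuming nodes are ordered input, hidden, output (as Network keeps them) and that SWAP_NODES picks its two candidates by index:

// When mutateOutput is false, sample swap candidates only from the
// hidden range of this.nodes, mirroring MOD_ACTIVATION's behaviour.
var end = method.mutateOutput
  ? this.nodes.length
  : this.nodes.length - this.output;
var node1 = this.nodes[Math.floor(Math.random() * (end - this.input) + this.input)];
var node2 = this.nodes[Math.floor(Math.random() * (end - this.input) + this.input)];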

Shared weights

In order to improve the results of neural networks on larger datasets (e.g. MNIST), we need to implement shared weights. This will also allow the creation of convolutional networks.

Best genome lost

I'm training with NEAT and I've noticed that the best genome can disappear. The same data is being used, and I have 10% of the population as elitism.

For example my console logging shows:

Generation: 3 - average score: -966.382
Fittest score: -955.400, Best genome, nodes length: 6, nodes connections: 5
Generation: 4 - average score: -954.417
Fittest score: -213.391, Best genome, nodes length: 6, nodes connections: 5
Generation: 5 - average score: -978.215
Fittest score: -955.400, Best genome, nodes length: 6, nodes connections: 5
Generation: 6 - average score: -965.121
Fittest score: -955.400, Best genome, nodes length: 6, nodes connections: 5

I'm not sure whether this is a bug or a feature. If it's a bug, I'll try to gather more info if needed.

Architect activation

Hi,

Is it possible to use Architect.LSTM and define the activation method?

Cheers,

Evolve does not stop after the specified generation

I ran evolve on a dataset with error set to 0.004 and generations set to 1000. Evolution did not stop until after generation 11100, and I'm not actually sure why it decided to stop at that point, as the error was still not lower than the specified 0.004.

Suggestions

Hi. This thread is meant to show you what I'm working on at the moment. At the same time, it is a place for your suggestions on what to add to or improve in Neataptic.

What I've done

  • A bigger variety of loss functions, basically this list - ✅ 4dc7e39 wiki
  • More activation functions check this out - ✅ 2fd4dbd wiki
  • Mutation method Mutation.SWAP_NODES - ✅ 605cda1 wiki
  • NARX networks - ✅ 0fdd4b0 wiki
  • GRU networks - ✅
  • Regularization - ✅
  • Actually making Selection.FITNESS_PROPORTIONATE proportionate, creating Selection.POWER ✅ - wiki
  • Batch training - ✅

What I'm looking into / working on

I can't guarantee that items on this list will be implemented!

Layer.LSTM not performing as well as Architect.LSTM

Currently I'm rewriting all the built-in network code to construct networks from layers instead of groups and nodes. However, using groups/nodes to construct an LSTM network seems to always outperform using LSTM layers.

A simple one-layer LSTM network built with layers:

function LSTM(inputSize, hiddenSize, outputSize){
  var input = new Layer.Dense(inputSize);
  var hidden = new Layer.LSTM(hiddenSize);
  var output = new Layer.Dense(outputSize);

  input.connect(hidden, Methods.Connection.ALL_TO_ALL);
  hidden.connect(output, Methods.Connection.ALL_TO_ALL);

  // option.inputToOutput is set to true for Architect.LSTM
  if (true) {
    input.connect(output, Methods.Connection.ALL_TO_ALL);
  }

  return Architect.Construct([input, hidden, output]);
}

So:

var network = new LSTM(1, 6, 1);

// is the equivalent of

var network = new Architect.LSTM(1,6,1);

However, there is one key difference: Layer.LSTM contains an output block itself. So when you connect a Layer.LSTM to an output layer, there will be an extra group in between (this is needed when connecting two LSTM layers to each other). Architect.LSTM also does this, except for the output layer:

var outputBlock = i == blocks.length-1 ? outputLayer : new Group(block);

However, there is no way for a Layer.LSTM to know that the next layer is a regular dense layer; if it knew, no outputBlock would be needed.

Visualisation


Left: layer LSTM - Right: architect LSTM

You can clearly see the lack of gates/extra nodes in the second image. All these extra nodes and gates require a lot of extra iterations to converge to a solution.

Current solution idea

Make the Layer.Dense.input() function detect whether the connecting layer is an LSTM. If it is, remove its outputBlock. This requires a workaround in the connect() function, though.

This issue will not be fixed soon.

reverse outputs order in training

Hello. Thanks for a great project!

Looks like there is a bug in the way you fetch target values when propagating errors:

propagate: function(rate, target){
    this.nodes.reverse();

    // Propagate nodes from end to start
    for(node in this.nodes){
      switch(this.nodes[node].type){
        case('hidden'):
          this.nodes[node].propagate(rate);
          break;
        case('output'):
          this.nodes[node].propagate(rate, target[node]);
          break;
      }
    }

    this.nodes.reverse();
  },

All nodes are reversed, including the outputs, but the target values are not.
By the way, the outputs can be in the middle of the nodes array rather than at the end; in that case target[node] can be undefined.
To preserve the output order, I suggest:

propagate: function(rate, target){
    this.nodes.reverse();
    var targetIndex = target.length;
    // Propagate nodes from end to start
    for(node in this.nodes){
      switch(this.nodes[node].type){
        case('hidden'):
          this.nodes[node].propagate(rate);
          break;
        case('output'):
          this.nodes[node].propagate(rate, target[--targetIndex]);
          break;
      }
    }

    this.nodes.reverse();
  },

network.train(...) iterations stop at random points

I'm trying to figure out why, when I run network.train(...) with the options below (note that I commented out the error option), the iterations randomly stop somewhere between the 190th and the 250th iteration. Is there a simple way to make it run all 1000 iterations?


Using this cdn for the script: <script src="https://wagenaartje.github.io/neataptic/cdn/neataptic.js"></script>

const training_options = {
  log: 10,
  // error: 0.03,
  iterations: 1000,
  rate: 0.3
};

tutorial problem

Hello. I am a beginner who appreciates the convenience of neataptic. :)

In the Training and Evolution sections of the tutorial,

yourNetwork.train(yourData, yourOptions);
yourNetwork.evolve(yourData, yourOptions);

It says that the default values are preset, but the tutorial does not explain how to use them.

yourNetwork.train(yourData, {});
yourNetwork.evolve(yourData, {});

I've found that there is no problem when I write the code this way. It can be quite confusing for beginners like me (I honestly don't know whether it's right or wrong to write code like this), but I have not seen a library as kind to beginners as this one.
