This project is a simple neural network written in Julia. It provides a forward-propagation function and a backpropagation function.
The project includes several activation functions:
- Sigmoid
- TanH
- ReLU
- PReLU
- Softmax
- ELU
- Identity
- Binary
Each activation function comes with its derivative, so all of them can be used directly in an NN's backpropagation.
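As a sketch of what such paired functions can look like, here are a few of the listed activations with their derivatives written as plain Julia functions (the names are illustrative, not necessarily the ones used in this repository):

```julia
# Scalar activations and their derivatives (illustrative names).
sigmoid(x) = 1 / (1 + exp(-x))
dsigmoid(x) = sigmoid(x) * (1 - sigmoid(x))

relu(x) = max(zero(x), x)
drelu(x) = x > 0 ? one(x) : zero(x)

# PReLU with a small assumed slope for negative inputs.
prelu(x, a=0.01) = x > 0 ? x : a * x
dprelu(x, a=0.01) = x > 0 ? one(x) : a

dtanh(x) = 1 - tanh(x)^2   # tanh itself is built into Julia

# Softmax acts on a whole vector rather than a scalar.
softmax(v) = exp.(v) ./ sum(exp.(v))
```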
To use the NN, you first need to install Julia.
Then hook your own function into the network to add features (an example function is provided). Finally, you can add a plotting function using a package such as Plots or Gadfly.
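If you record the loss at each epoch, plotting it with Plots.jl could look like the following (the use of Plots here is an assumption; Gadfly would work just as well):

```julia
using Plots

losses = [1.0 / k for k in 1:50]   # placeholder data standing in for recorded losses
plot(losses, xlabel="epoch", ylabel="loss", label="training loss")
```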
This NN is not meant to be used directly; it is primarily a demonstration of the mathematics behind a neural network.
The activation functions and their derivatives, however, can easily be reused in another project.
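To illustrate the mathematical side the project demonstrates, here is a minimal sketch of a one-hidden-layer network with a forward pass and one gradient-descent step via backpropagation. The shapes, names, learning rate, and squared loss are assumptions for illustration, not this repository's actual API:

```julia
sigmoid(x) = 1 ./ (1 .+ exp.(-x))

# Forward pass: input x through hidden layer (W1, b1) and output layer (W2, b2).
function forward(W1, b1, W2, b2, x)
    h = sigmoid(W1 * x .+ b1)      # hidden activations
    y = sigmoid(W2 * h .+ b2)      # output
    return h, y
end

# One backpropagation step for the squared loss L = ½‖y − t‖²,
# updating the parameters in place; returns the loss before the update.
function backprop!(W1, b1, W2, b2, x, t; lr=0.1)
    h, y = forward(W1, b1, W2, b2, x)
    δ2 = (y .- t) .* y .* (1 .- y)         # output-layer error (sigmoid derivative)
    δ1 = (W2' * δ2) .* h .* (1 .- h)       # error propagated to the hidden layer
    W2 .-= lr .* (δ2 * h');  b2 .-= lr .* δ2
    W1 .-= lr .* (δ1 * x');  b1 .-= lr .* δ1
    return sum(abs2, y .- t) / 2
end
```

Repeated calls to `backprop!` on the same example should drive the loss down, which is the behavior the project's backpropagation function demonstrates.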
Julia : https://julialang.org/
Blog : https://aidri.github.io/emping/