
NeuralNetwork

What's this?

A Java implementation of convolutional and fully-connected neural networks, with a plugin for [Weka](http://www.cs.waikato.ac.nz/ml/weka/). Uses dropout regularization and rectified linear units. The implementation is multithreaded and uses the [MTJ](https://github.com/fommil/matrix-toolkits-java) matrix library with native libraries for performance.

Installation

Weka

Go to https://github.com/amten/NeuralNetwork/releases/latest to find the latest release. Download the files NeuralNetwork.zip and BLAS-dlls.zip. In Weka, go to Tools/Package Manager and press the "File/URL" button. Browse to the NeuralNetwork.zip file and press "OK".

Important! For optimal performance, you need to install native matrix library files.
Windows: Unzip the BLAS-dlls.zip file to Weka's install dir (".../Program Files/Weka-3-7").
Ubuntu: Run "sudo apt-get install libatlas3-base libopenblas-base" in a terminal window.

Standalone

This package was made mainly to be used from the Weka UI, but it can also be used from your own Java code.

Go to https://github.com/amten/NeuralNetwork/releases/latest to find the latest release. Download the file NeuralNetwork.zip and unzip.

Include the files NeuralNetwork.jar, lib/mtj-1.0-snapshot.jar and lib/opencsv-2.3.jar on your classpath.

Important! For optimal performance, you need to install native matrix library files.
Windows: Unzip the BLAS-dlls.zip file to the directory where you execute your application, or any other directory in the PATH.
Ubuntu: Run "sudo apt-get install libatlas3-base libopenblas-base" in a terminal window.

Usage

Weka

In Weka, you will find the classifier under classifiers/functions/NeuralNetwork. For explanations of the settings, click the "more" button.
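
The classifier can also be driven from Weka's Java API instead of the GUI. Below is a minimal cross-validation sketch; the fully qualified class name passed to forName is an assumption, so use the name displayed under classifiers/functions/NeuralNetwork in your installation.

```java
import java.util.Random;

import weka.classifiers.AbstractClassifier;
import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class WekaNeuralNetworkExample {
    public static void main(String[] args) throws Exception {
        // Load a dataset and use the last attribute as the class.
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Instantiate the classifier by name. The fully qualified name below
        // is an assumption -- use the name shown in Weka's classifier tree.
        Classifier nn = AbstractClassifier.forName("amten.ml.NeuralNetwork", new String[0]);

        // Evaluate with 10-fold cross-validation and print a summary.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(nn, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```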

Note 1: If you start Weka with a console (an option available in the Windows start menu), the cost is printed for each training iteration and you can press Enter in the console window to halt training.

Note 2: When using dropout as regularization, it may still be a good idea to keep a small weight penalty; this keeps the weights from exploding and causing numerical overflow.

Note 3: When using convolutional layers, it seems to be most efficient to use batch-size=1 (i.e. stochastic gradient descent).

Standalone

Example code showing classification and regression can be found here: https://github.com/amten/NeuralNetwork/tree/master/src/amten/ml/examples
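
For orientation before opening those examples, here is a minimal sketch of the train-then-predict flow. The class, field and method names (NNParams, NeuralNetwork, train, getPredictions) and their signatures are assumptions for illustration only; the linked example code shows the actual API, including how input data is represented.

```java
import amten.ml.NNParams;       // assumed package and class names -- see the linked examples
import amten.ml.NeuralNetwork;

public class StandaloneSketch {
    public static void main(String[] args) throws Exception {
        // Tiny toy dataset: two features per row, one class label per row.
        // The real examples read data from CSV files via the bundled opencsv library.
        double[][] x = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
        double[][] y = { {0}, {1}, {1}, {0} };

        // Hyperparameters. The field names here are assumptions; note 3 above
        // suggests batch size 1 (stochastic gradient descent) when using
        // convolutional layers.
        NNParams params = new NNParams();
        params.numClasses = 2;
        params.batchSize = 100;
        params.maxIterations = 200;

        // Train a fully-connected network and predict on the training data.
        NeuralNetwork nn = new NeuralNetwork(params);
        nn.train(x, y);                              // assumed training entry point
        int[] predictions = nn.getPredictions(x);    // assumed prediction call

        for (int p : predictions) {
            System.out.println(p);
        }
    }
}
```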

License

Free to copy and modify. Please include the author's name if you copy code.
