
Neural Network Hyper Parameter Optimization (NHOP) Using Distributed TensorFlow

Version: 1.0
Python: 2.7 & 3.x
TensorFlow: 1.0.1

Introduction:

The problem of identifying a good value for the hyperparameter λ, usually through an iterative and approximate method, is called the problem of hyperparameter optimization. The computation performed by a learning algorithm A involves an inner loop, which is iterative, and an outer loop that optimizes over the hyperparameters. It can be written as [1]:

λ* = argmin_λ E_{x ~ Gx} [ L(x; A_λ(X^(train))) ]

where,
E_{x ~ Gx} [ · ] is the generalization error
Gx is the ground-truth distribution
X^(train) is a finite set of samples x drawn from Gx
L(x; f) is the loss of a model f on a sample x
A_λ is the learning algorithm configured with hyperparameters λ

The NHOP software framework helps achieve the above objective. It uses Python as its frontend and distributed TensorFlow as its backend for training neural networks.

Inner-loop optimization is performed with the TensorFlow API, where computations are expressed as stateful dataflow graphs. The outer loop, called the "Optimizer", is a Python process that samples hyperparameters from the search space using the Random Search algorithm [1] and feeds them to the inner loop, which iteratively trains on the finite sample set X^(train) to minimize the expected loss. In practice, what we need is a way to choose λ so as to minimize the generalization error. The NHOP framework therefore lets users specify distribution bounds (a search space) for each hyperparameter; a minimal sketch of the outer-loop idea follows.
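The snippet below is only an illustrative, framework-independent sketch of such a random-search outer loop; the parameter names, bounds, trial count, and the stub train_and_evaluate function are assumptions for illustration, not the actual NHOP implementation.

import random

def sample_hyperparameters():
    # Draw one candidate λ from user-specified distribution bounds (illustrative values).
    return {
        "learning_rate": 10 ** random.uniform(-4, -1),   # log-uniform over [1e-4, 1e-1]
        "hidden_units": random.choice([64, 128, 256]),
        "batch_size": random.choice([32, 64, 128]),
    }

def train_and_evaluate(params):
    # Stand-in for the inner loop: train A_λ on X^(train) with TensorFlow and
    # return the validation error (a proxy for the generalization error).
    return random.random()

best_error, best_params = float("inf"), None
for _ in range(50):                                       # number of random-search trials
    params = sample_hyperparameters()
    error = train_and_evaluate(params)
    if error < best_error:
        best_error, best_params = error, params
print("best hyperparameters:", best_params, "validation error:", best_error)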

Architecture:

Figure: Neural Network Hyperparameter Optimization framework in Python using a distributed TensorFlow architecture.

The OPTIMIZER forks multiple PS (parameter server) and WORKER (training server) Python processes, which then run the distributed TensorFlow architecture. The framework supports the following Deep Neural Network (DNN) TensorFlow models (a cluster-setup sketch follows the list):

  1. Feed Forward DNN Regressor
  2. RNN-LSTM Classifier (example for custom model)
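
For orientation, the sketch below shows how each forked process could join a distributed TensorFlow (1.x) cluster; the hostnames, ports, and task assignment are assumptions for illustration, not the framework's actual configuration.

import tensorflow as tf

# Example cluster definition (hosts and ports are placeholders).
cluster = tf.train.ClusterSpec({
    "ps":     ["localhost:2222"],
    "worker": ["localhost:2223", "localhost:2224"],
})

# Each forked process starts a server for its own job and task index,
# e.g. as assigned to it by the Optimizer.
job_name, task_index = "worker", 0
server = tf.train.Server(cluster, job_name=job_name, task_index=task_index)

if job_name == "ps":
    server.join()   # parameter servers only serve the shared variables
else:
    # Workers build the model graph; variables are placed on the PS tasks.
    with tf.device(tf.train.replica_device_setter(
            worker_device="/job:worker/task:%d" % task_index,
            cluster=cluster)):
        pass        # build the DNN Regressor / RNN-LSTM graph here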

Design:

Figure: software design.

Pre Requisites:

This framework uses the Sacred experiment-tracking tool and a MongoDB server for ease of use. Please follow the instructions below to install Sacred, Sacredboard, and MongoDB:

$ brew install mongodb # install mongodb
$ mkdir mongo # create local directory for mongodb to write to
$ mongod --dbpath mongo # start mongodb server and tell it to write to local folder mongo

$ pip install git+https://github.com/IDSIA/sacred.git # install latest version of sacred
$ pip install sacredboard # install sacredboard

Start the Sacredboard server (optional; only needed to view the Sacredboard dashboard, and CPU-intensive when used):
$ sacredboard
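
For context, Sacred experiments record their runs to MongoDB through an observer, which is what Sacredboard later displays. A minimal stand-alone sketch is shown below; the experiment name, database name, and config value are illustrative assumptions rather than the framework's actual settings.

from sacred import Experiment
from sacred.observers import MongoObserver

ex = Experiment('nhop_demo')                    # hypothetical experiment name
ex.observers.append(MongoObserver.create(       # log runs to the local MongoDB
    url='localhost:27017', db_name='sacred'))

@ex.config
def config():
    train_epoch = 500                           # example hyperparameter

@ex.automain
def main(train_epoch):
    print('training for %d epochs' % train_epoch)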

Parameter Configuration:

DEFAULT CONFIG:

$ python optimizer.py print_config

CUSTOM CONFIG:

(Edit the optimizer_config.yaml file as required.)

$ python optimizer.py print_config with optimizer_config.yaml

RUN OPTIMIZER:

$ python optimizer.py

RUN WITH SPECIFIC PARAMETER CHANGE:

$ python optimizer.py with train_epoch=500

TO VIEW OPTIMIZER RUN HISTORY:

Make sure the Sacredboard and MongoDB servers are running when executing the above Python command lines.
To view the optimizer run history, open http://127.0.0.1:5000/runs
