
adacnn's Introduction

Hi there! Amazing fellow human being 👋

I am a data scientist and I love getting my hands 🤚 and feet 🦶 wet in the problem-infested ☠ murky waters 🚤 of data to unearth invaluable information and actionable insights 💎.

What do I love?

  • 🐍 Python
  • 🤖 Machine Learning, Deep Learning, Artificial Intelligence
  • 🧮 TensorFlow

Some of my publications

adacnn's People

Contributors

thushv89

adacnn's Issues

New policy for adapting structure

1st epoch: add actions randomly
2nd epoch: add actions deterministically
3rd epoch: remove actions randomly
4th epoch onwards: prune the network
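
A minimal sketch of this per-epoch schedule; the helper name, the `candidate_actions` list, and the `policy` scoring callable are assumptions, not the repo's API:

```python
import random

def choose_structure_action(epoch, candidate_actions, policy):
    """Hypothetical helper illustrating the per-epoch schedule above.

    `candidate_actions` is a list of possible structure actions and
    `policy` is a callable that scores an action (both are assumptions).
    """
    if epoch == 1:                      # 1st epoch: add randomly
        return "add", random.choice(candidate_actions)
    elif epoch == 2:                    # 2nd epoch: add deterministically (greedy)
        return "add", max(candidate_actions, key=policy)
    elif epoch == 3:                    # 3rd epoch: remove randomly
        return "remove", random.choice(candidate_actions)
    else:                               # 4th epoch onwards: prune the network
        return "prune", None
```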

Major Bug: In updating target network weights

The weights were being updated incorrectly. If w are the weights of the actor/critic and w' are the weights of the target actor/critic, I was doing
w' <- w' * tau + (1-tau) * w'
which reduces to w' <- w' (a no-op, so the target network never tracked the actor/critic), instead of
w' <- w * tau + (1-tau) * w'
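
A minimal NumPy sketch of the two updates (the tau value and variable names are illustrative), showing that the buggy version leaves the target unchanged:

```python
import numpy as np

tau = 0.01  # soft-update rate (illustrative value)

def soft_update(w, w_target, tau):
    # correct: w' <- tau * w + (1 - tau) * w'
    return tau * w + (1 - tau) * w_target

def buggy_soft_update(w, w_target, tau):
    # buggy: w' <- tau * w' + (1 - tau) * w', which equals w' (a no-op)
    return tau * w_target + (1 - tau) * w_target

w = np.array([1.0, 2.0])   # actor/critic weights
w_target = np.zeros(2)     # target network weights

assert np.allclose(buggy_soft_update(w, w_target, tau), w_target)  # never moves
assert np.allclose(soft_update(w, w_target, tau), tau * w)         # tracks w
```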

Try out new action space

All the action values are produced by sigmoids.
A global action of -1 / 0 / +1 decides whether to remove that proportion from all layers, add it to all layers, or not adapt at all.
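
A rough sketch of what that action space could look like; the function and variable names are assumptions used only to illustrate the idea:

```python
def apply_global_action(global_action, proportion, layer_sizes):
    """global_action in {-1, 0, +1}: remove, no adaptation, or add.
    `proportion` is a sigmoid output in (0, 1) applied to every layer."""
    if global_action == 0:
        return list(layer_sizes)
    return [max(1, size + int(round(global_action * proportion * size)))
            for size in layer_sizes]

# remove roughly 25% of the filters from every layer
print(apply_global_action(-1, 0.25, [64, 128, 256]))  # [48, 96, 192]
```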

Things to keep in mind

Be careful how you use dropout: for small networks, a large dropout rate causes massive performance degradation.
Use a small validation set accumulation rate.
Restart momentum at the start of every new task (or use pool momentum, because the pool converges to a uniform distribution over time).

If you are scaling weights so that their expected sum is 1, do this consistently for both weights and bias.
Using a normalizing factor for adaptive nets turned out to be very bad.
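
A small sketch of the weight/bias point above; the particular normalizing rule (sum of absolute incoming weights per output unit) is an assumption, used only to show the same factor being applied to both weights and bias:

```python
import numpy as np

def normalize_layer(weights, bias):
    """Apply one normalizing factor per output unit to both weights and bias,
    so the weight/bias balance of the layer is preserved."""
    factor = np.abs(weights).sum(axis=0, keepdims=True)  # shape (1, n_out)
    return weights / factor, bias / factor.squeeze(0)

w = np.random.randn(16, 8)   # (n_in, n_out), illustrative shapes
b = np.random.randn(8)
w_n, b_n = normalize_layer(w, b)
```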

Try the following way of optimizing AdaCNN

After an add operation:
for the next m batches (until the next action), optimize only the newly added filters with the new data batches,
then finetune the whole net.

After a remove operation:
finetune the whole net.
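
A TensorFlow 2 sketch of that schedule; `new_filter_vars`, `batches_since_add`, and `m` are assumptions standing in for however the repo tracks newly added filters:

```python
import tensorflow as tf

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)

def train_step(model, x, y, trainable_vars):
    """One SGD step restricted to `trainable_vars`."""
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        loss = tf.reduce_mean(
            tf.keras.losses.sparse_categorical_crossentropy(
                y, logits, from_logits=True))
    grads = tape.gradient(loss, trainable_vars)
    optimizer.apply_gradients(zip(grads, trainable_vars))
    return loss

# After an add action: for the next m batches train only the new filters,
# then switch back to the full variable list (finetune the whole net).
# After a remove action: finetune the whole net right away.
def variables_for_batch(model, new_filter_vars, batches_since_add, m):
    if batches_since_add < m:
        return new_filter_vars
    return model.trainable_variables
```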

detect what's causing NaNs

Possible causes:
L2 decay: no
Adaptive dropout: no
Pool momentum: no
Way params are removed:
Data distribution:

Currently:
use_dropout set to False
use_l2_loss set to False
pool_momentum = 0.0
rm_indices are a continuous block
** no finetune after remove action ** This seems to be the cause of the NaNs
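
One way to narrow this down is to wrap suspect tensors (loss, logits, gradients) with tf.debugging.check_numerics so the first non-finite value raises a descriptive error; which tensors to wrap is an assumption left to the caller:

```python
import tensorflow as tf

def assert_finite(named_tensors):
    """Raise as soon as any of the given tensors contains a NaN or Inf.
    `named_tensors` is a dict such as {"loss": loss, "logits": logits}."""
    return {name: tf.debugging.check_numerics(t, message=f"non-finite in {name}")
            for name, t in named_tensors.items()}
```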
