gatenet's People

Contributors

chrisnielsen, kirill5pol, polzounov

gatenet's Issues

Save/Load Meta-optimizer

When using the save/load functionality of the meta-optimizer there are some issues with module-wise sharing. In l2l/networks.py, the save function's to_save variable holds a dictionary of modules rather than a single module.

The following error occurs:

  File ".../l2l/networks.py", line 50, in save
    variables = snt.get_variables_in_module(network)
  File ".../python/modules/util.py", line 77, in get_variables_in_module
    return module.get_variables(collection=collection)
AttributeError: 'tuple' object has no attribute 'get_variables'

Apply this fix, along with any others needed, to get the save/load functionality working.
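
One possible direction, as a rough sketch only: have the save path detect the dict case and collect variables per module. The helper name below is hypothetical, and the layout of to_save (names mapping to Sonnet modules under module-wise sharing) is an assumption.

    import sonnet as snt

    def _variables_for_save(to_save):
        # Hypothetical helper: to_save is assumed to be either a single Sonnet
        # module or, for module-wise sharing, a dict mapping names to modules.
        if isinstance(to_save, dict):
            variables = []
            for module in to_save.values():
                variables.extend(snt.get_variables_in_module(module))
            return variables
        return snt.get_variables_in_module(to_save)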

Allow layers to have varying number of modules

Right now we assume the graph definition will always have the same number of modules per layer. This will cause issues when displaying the gates (determine_gates in graph.py), and possibly elsewhere.

Two solutions may work here:

  1. Allow a differing number of modules per layer by modifying determine_gates (different visualization method)
  2. (More likely) Raise errors when the number of modules is incorrect (a sketch of such a check follows this list)
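
A minimal sketch of what option 2 could look like, assuming the graph definition can report a per-layer module count; the function and argument names are illustrative, not existing gatenet code.

    def check_uniform_module_count(modules_per_layer):
        """Raise if the layers do not all have the same number of modules."""
        # modules_per_layer is assumed to be a list such as [4, 4, 4].
        if len(set(modules_per_layer)) > 1:
            raise ValueError(
                'determine_gates assumes every layer has the same number of '
                'modules, but got per-layer counts: {}'.format(modules_per_layer))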

Restructure Layer

A few things need to be done to clean up the Layer class:

  • Instead of defining the shapes of the module weights in Layer (lines 28-29 of graph/layer.py), encapsulate weight creation in Module (see the sketch after this list)
  • The modules should only take an input shape, an output shape, and a module type
  • Assuming that we only use Addition or Identity sublayers, Module and Layer output shapes are the same
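
A rough sketch of the proposed Module interface, assuming TF1-style variable creation; the shapes and variable names are illustrative, not the current graph/layer.py code.

    import tensorflow as tf

    class Module(object):
        """Owns its own weight creation instead of relying on Layer."""

        def __init__(self, input_size, output_size, module_type, name='module'):
            self.module_type = module_type
            with tf.variable_scope(name):
                self._w = tf.get_variable('w', shape=[input_size, output_size])
                self._b = tf.get_variable('b', shape=[output_size],
                                          initializer=tf.zeros_initializer())

        def __call__(self, inputs):
            # Output shape matches the Layer output shape when only Addition
            # or Identity sublayers are used (see the assumption above).
            return tf.matmul(inputs, self._w) + self._b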

Clean up compute_gates (layer.py)

Make the code more readable and clear.

Also: the way self.gates = gates_normalized is assigned could be improved so that self.gates is cleaner to access later on (see the sketch below).
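
One way this could look, purely as an assumption about how layer.py might be reorganized (the attribute and method bodies here are illustrative): keep the normalized tensor private and expose it through a read-only property.

    import tensorflow as tf

    class Layer(object):
        def compute_gates(self, gate_logits):
            # Softmax across modules normalizes the gate logits.
            self._gates_normalized = tf.nn.softmax(gate_logits)
            return self._gates_normalized

        @property
        def gates(self):
            """Normalized gate values, valid after compute_gates() has run."""
            return self._gates_normalized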

Integrate Modules with Sonnet

Sonnet's structure and ours share many similarities.
It should be easy to integrate with Sonnet and get some of its features without needing to develop them ourselves.

Integrate Sonnet by subclassing our modules from snt.AbstractModule; this gives us variable sharing and easy integration with the code from learning-to-learn.
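
A minimal sketch of what the subclassing could look like, assuming Sonnet 1.x; GatedModule and its linear body are illustrative stand-ins for our module types, not existing code.

    import sonnet as snt

    class GatedModule(snt.AbstractModule):
        """One of our modules wrapped as a Sonnet module."""

        def __init__(self, output_size, name='gated_module'):
            super(GatedModule, self).__init__(name=name)
            self._output_size = output_size

        def _build(self, inputs):
            # Variables created here are reused on later calls, which gives us
            # the variable sharing the learning-to-learn code relies on.
            return snt.Linear(self._output_size)(inputs)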

Clean up process_layer (layers.py)

Right now this function is built to work with 2d inputs only.
Currently a hack converts 4d inputs to 2d, processes the modules, then converts each output back to 4d.
Make this work properly with inputs of any shape.
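
One possible direction, as a hedged sketch only (the helper below is hypothetical and not the current layers.py code): flatten all non-batch dimensions generically so the same path handles 2d, 4d, or any other rank, instead of special-casing 4d.

    import tensorflow as tf

    def apply_module_any_rank(module, inputs):
        # Flatten everything except the batch dimension so the module sees a
        # 2d tensor, then restore the original shape afterwards.
        original_shape = tf.shape(inputs)
        flat_inputs = tf.reshape(inputs, [original_shape[0], -1])
        flat_outputs = module(flat_inputs)
        # Assumes the module preserves the flattened feature size; otherwise
        # the output shape has to be derived from the module itself.
        return tf.reshape(flat_outputs, original_shape)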

(L2L) - Put FlatteningHelper in the RNN itself instead of the meta optimizer

Right now the meta-optimizer has to do all of the reshaping to get gradients and deltas into the right shapes to work with the RNNs.
Instead, write a wrapper that works for any RNN, taking in a list of (grads, vars) and outputting a list of the same type (a sketch follows the two snippets below).

What we have now in the meta optimizer (dirty!)

        with tf.name_scope('meta_optimizer_step'):
            list_deltas = []
            for optimizer in self._optimizers:
                with tf.name_scope('deltas'):
                    OptimizerType = optimizer.rnn        # same as optimizer[0]
                    flat_helper = optimizer.flat_helper  # same as optimizer[1]

                    # Flatten the gradients from a list of (gradient, variable)
                    # into a single tensor of shape (batch_size, k)
                    flattened_grads = flat_helper.flatten(gradients)
                    # Run one step of the RNN optimizer
                    flattened_deltas = OptimizerType(flattened_grads)
                    # Get the deltas back into their original form
                    deltas = flat_helper.unflatten(flattened_deltas)
                    list_deltas.append(deltas)

What we want:

        with tf.name_scope('meta_optimizer_step'):
            list_deltas = []
            for optimizer in self._optimizers:
                with tf.name_scope('deltas'):
                    OptimizerType = optimizer.rnn

                    # Both gradients and deltas are lists of (gradient, variable) tuples
                    # Run one step of the RNN optimizer
                    deltas = OptimizerType(gradients)
                    list_deltas.append(deltas)
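
A hedged sketch of such a wrapper, assuming FlatteningHelper exposes flatten()/unflatten() as used above; the class name is illustrative.

    class GradListRNN(object):
        """Wraps an RNN optimizer so it consumes and returns (grad, var) lists."""

        def __init__(self, rnn, flat_helper):
            self._rnn = rnn
            self._flat_helper = flat_helper

        def __call__(self, gradients):
            # Flatten the list of (gradient, variable) into one (batch_size, k) tensor.
            flattened_grads = self._flat_helper.flatten(gradients)
            # One step of the RNN meta-optimizer on the flattened gradients.
            flattened_deltas = self._rnn(flattened_grads)
            # Restore the list-of-(delta, variable) structure the caller expects.
            return self._flat_helper.unflatten(flattened_deltas)

With each optimizer wrapped this way, the meta-optimizer loop reduces to the "What we want" snippet above, since the reshaping lives inside the wrapper.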
