gatenet's Issues
More complex layer & module generation
Right now we use a single module type throughout the graph. Create a generation function that allows for more complicated structures.
Save/Load Meta-optimizer
When using the save/load functionality of the meta-optimizer, there are some issues with module-wise sharing. In the `save` function of `l2l/networks.py`, the `to_save` variable returns a dictionary of modules rather than a single module, and the following error occurs:

```
File ".../l2l/networks.py", line 50, in save
    variables = snt.get_variables_in_module(network)
File ".../python/modules/util.py", line 77, in get_variables_in_module
    return module.get_variables(collection=collection)
AttributeError: 'tuple' object has no attribute 'get_variables'
```

Apply this fix, and any others needed, to get the save/load functionality working.
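A hedged sketch of one possible fix, assuming `to_save` can be either a single module or a dict of modules: `collect_variables` is a hypothetical helper, and the variable-lookup function is injected here for illustration (in the real code it would be `snt.get_variables_in_module`).

```python
def collect_variables(to_save, get_variables_in_module):
    """Gather variables from a single module or a dict of modules.

    Sketch only: handles the module-wise sharing case where `to_save`
    is a dict, instead of passing the dict straight to the lookup.
    """
    if isinstance(to_save, dict):
        variables = []
        for module in to_save.values():
            variables.extend(get_variables_in_module(module))
        return variables
    return list(get_variables_in_module(to_save))
```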
Allow layers to have varying numbers of modules
Right now we assume that the graph definition always has the same number of modules per layer. This will cause issues when displaying the gates (`determine_gates` in graph.py), and possibly elsewhere.
Two solutions may work here:
- Allow a differing number of modules per layer by modifying `determine_gates` (requires a different visualization method)
- (More likely) Raise an error when the number of modules is inconsistent
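If the error-raising route is taken, a minimal validation sketch could look like this (the `graph_definition` structure is an assumption: a list of layers, each a list of module specs):

```python
def validate_module_counts(graph_definition):
    """Raise early if layers declare differing numbers of modules.

    Hypothetical helper: graph_definition is assumed to be a list of
    layers, each itself a list of module specifications.
    """
    counts = [len(layer) for layer in graph_definition]
    if len(set(counts)) > 1:
        raise ValueError(
            "All layers must have the same number of modules; got %s" % counts)
```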
Restructure Layer
A few things need to be done to clean up the Layer class:
- Instead of defining the shapes of the modules' weights in Layer (lines 28-29 of graph/layer.py), encapsulate weight creation inside Module
- A Module should only take an input shape, an output shape, and a module type
- Assuming we only use Addition or Identity sublayers, Module and Layer output shapes are the same
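The second point above could look roughly like this (a sketch only; the attribute names are assumptions, and a real implementation would allocate TensorFlow variables rather than just recording shapes):

```python
class Module:
    """Sketch: a Module owns its weight shapes, so Layer no longer
    computes them (cf. graph/layer.py lines 28-29)."""

    def __init__(self, input_shape, output_shape, module_type):
        self.input_shape = input_shape
        self.output_shape = output_shape
        self.module_type = module_type
        # Weight creation is encapsulated here rather than in Layer;
        # a real version would create tf.Variables of this shape.
        self.weight_shape = (input_shape, output_shape)
```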
Clean up compute_gates (layer.py)
Make the code more readable and clear.
Also: the assignment `self.gates = gates_normalized` could be improved so that `self.gates` is cleaner to access later on.
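One way to improve the `self.gates` access pattern is a guarded read-only property (a sketch, not the repo's actual code; the normalization shown is a stand-in for whatever `compute_gates` really does):

```python
class Layer:
    """Sketch of exposing gates via a property instead of an inline
    `self.gates = gates_normalized` assignment (hypothetical)."""

    def __init__(self):
        self._gates = None

    def compute_gates(self, raw_gates):
        # Stand-in normalization; the real computation lives in layer.py.
        total = sum(raw_gates)
        self._gates = [g / total for g in raw_gates]
        return self._gates

    @property
    def gates(self):
        # Fail loudly if gates are read before they are computed.
        if self._gates is None:
            raise RuntimeError("compute_gates() has not been called yet")
        return self._gates
```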
Get the meta-optimizer working
Integrate Modules with Sonnet
The structure of Sonnet and our structure share many similarities, so it should be easy to integrate with Sonnet and gain some of its features without developing them ourselves.
Integrate Sonnet by subclassing our modules from snt.AbstractModule; this gives us variable sharing and easy integration with the code from learning-to-learn.
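The subclassing pattern might look like the following. `AbstractModule` here is a pure-Python stand-in so the sketch is self-contained; the real code would subclass `snt.AbstractModule` and create variables inside `_build` (Sonnet v1 API), getting variable sharing for free.

```python
class AbstractModule:
    """Stand-in for snt.AbstractModule, for illustration only."""

    def __init__(self, name):
        self.name = name

    def __call__(self, inputs):
        # Sonnet would enter a variable scope here before calling _build.
        return self._build(inputs)


class LinearModule(AbstractModule):
    """Hypothetical gatenet module written in the Sonnet style."""

    def __init__(self, scale, name="linear_module"):
        super().__init__(name)
        self.scale = scale

    def _build(self, inputs):
        # A real version would create/reuse tf.Variables here.
        return [self.scale * x for x in inputs]
```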
Clean up process_layer (layers.py)
Right now this function is built to work with 2-D inputs only.
Currently a hack converts 4-D inputs to 2-D, processes the modules, then converts each output back to 4-D.
Make this work properly with inputs of any rank.
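A generic reshape-apply-restore helper could replace the 4-D-specific hack. This is a sketch using NumPy for illustration (the real code would use TensorFlow reshapes); `module_fn` is assumed to preserve the flattened feature dimension.

```python
import numpy as np

def apply_module_any_rank(inputs, module_fn):
    """Flatten any-rank input to 2-D (batch, features), apply a 2-D
    module, then restore the original shape (hypothetical helper)."""
    original_shape = inputs.shape
    flat = inputs.reshape(original_shape[0], -1)
    out = module_fn(flat)
    return out.reshape(original_shape)
```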
(L2L) - Put FlatteningHelper in the RNN itself instead of the meta optimizer
Right now the meta optimizer has to do all of the reshaping to get gradients and deltas into the right shapes to work with the RNNs.
Instead, write a wrapper that works for any RNN, taking in a list of (grads, vars) and outputting a list of the same type.
What we have now in the meta optimizer (dirty!):

```python
list_deltas = []
with tf.name_scope('meta_optmizer_step'):
    for optimizer in self._optimizers:
        with tf.name_scope('deltas'):
            OptimizerType = optimizer.rnn        # same as optimizer[0]
            flat_helper = optimizer.flat_helper  # same as optimizer[1]
            # Flatten the gradients from a list of (gradient, variable)
            # pairs into a single tensor (batch_size, k)
            flattened_grads = flat_helper.flatten(gradients)
            # Run a step of the RNN optimizer
            flattened_deltas = OptimizerType(flattened_grads)
            # Get the deltas back into their original form
            deltas = flat_helper.unflatten(flattened_deltas)
            list_deltas.append(deltas)
```
What we want:

```python
list_deltas = []
with tf.name_scope('meta_optmizer_step'):
    for optimizer in self._optimizers:
        with tf.name_scope('deltas'):
            OptimizerType = optimizer.rnn
            # Both gradients and deltas are lists of (gradient, variable) tuples
            # Run a step of the RNN optimizer
            deltas = OptimizerType(gradients)
            list_deltas.append(deltas)
```
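The desired behaviour could be packaged as a wrapper that hides the flattening inside the optimizer object itself (a sketch; `rnn` and `flat_helper` are assumed to have the interfaces used in the snippets above):

```python
class FlatteningRNNWrapper:
    """Sketch of the proposed wrapper: accepts a list of (grad, var)
    pairs, flattens the grads, runs the wrapped RNN step, and
    unflattens the deltas back into (delta, var) pairs."""

    def __init__(self, rnn, flat_helper):
        self._rnn = rnn
        self._flat_helper = flat_helper

    def __call__(self, gradients):
        flat_grads = self._flat_helper.flatten(gradients)
        flat_deltas = self._rnn(flat_grads)
        return self._flat_helper.unflatten(flat_deltas)
```

With this in place, the meta optimizer's loop reduces to `deltas = optimizer(gradients)`.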
Meta-learning task setup
Add support for the meta-training problem setup,
e.g. like the following (image from "Optimization as a Model for Few-Shot Learning").
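For reference, an N-way K-shot episode sampler in the style of that setup might look like this (a hedged sketch; the dataset is assumed to map each class label to a list of examples, and all names are hypothetical):

```python
import random

def sample_episode(dataset, n_way=5, k_shot=1, k_query=1):
    """Sample one meta-training episode: n_way classes, k_shot support
    examples and k_query query examples per class."""
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        examples = random.sample(dataset[label], k_shot + k_query)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query
```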