Comments (4)
- No. If you go through the math, only second-order terms arise, even when using multiple gradient steps; this is because the gradient steps after the first are not taken w.r.t. theta, but w.r.t. the already-adapted parameters.
- In the few-shot supervised setting, a task is a set of N classes. Hence, if a dataset has 1200 classes, then there are 1200 choose N tasks total.
- In general, there is no known measure of similarity. The test tasks and the training tasks should be sampled from the same distribution.
- See the stop_grad flag in this repo. We stop the gradient through the gradient term, cutting the dependency of grad_theta on theta.
- Roughly, yes. The directions come from the gradient, so the algorithm is trying to find an initialization such that the gradient points in the correct directions. In practice, there is not just one optimum, but a whole space of optima, particularly with large, deep, overparameterized neural networks. In this way, Figure 1 can be misleading; the optimization is more flexible than it looks.
from maml.
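The stop_grad behavior described above can be illustrated on a toy one-dimensional task where both the full MAML meta-gradient and its first-order approximation have closed forms. This is a hedged sketch, not the repo's TensorFlow code: the quadratic task loss L_tau(theta) = 0.5 * a * (theta - c)^2 and all constants are illustrative assumptions chosen so the gradients can be written by hand.

```python
# Hedged sketch (not the repo's code): full vs. first-order MAML
# meta-gradient for the toy quadratic task loss
#   L_tau(theta) = 0.5 * a * (theta - c)^2.

def inner_grad(theta, a, c):
    # dL_tau/dtheta for the quadratic loss above.
    return a * (theta - c)

def meta_gradients(theta, a, c, alpha):
    """Return (full_maml_grad, first_order_grad) for one task."""
    theta_prime = theta - alpha * inner_grad(theta, a, c)   # inner gradient step
    g_post = inner_grad(theta_prime, a, c)                  # gradient at adapted params
    # Full MAML: chain rule through the inner update; for this quadratic,
    # d(theta')/d(theta) = 1 - alpha * a.
    full = g_post * (1.0 - alpha * a)
    # First-order (stop_grad): treat d(theta')/d(theta) as the identity,
    # i.e. stop the gradient through the gradient term.
    first_order = g_post
    return full, first_order

full, fo = meta_gradients(theta=0.0, a=2.0, c=1.0, alpha=0.1)
```

Dropping the (1 - alpha * a) factor is exactly what cutting the dependency of grad_theta on theta does: the second-order term vanishes and only the post-update gradient survives.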
Hi Chelsea,
Thank you very much for your detailed response. I think I understood your explanations, but I have one more question regarding the supervised classification setting. Suppose you want to do N-way K-shot classification. In your experiments, is it the case that the batch size of the inner gradient is N x K? (I ask because when calling main.py you set batch size = 1, but later the tensor is defined as batch size x number of classes, and given how you defined a task in the classification setting, I think N x K is the correct answer.)
Moreover, during test time, do you still use the same batch size, i.e. N x K? Lastly, during test time, you simply use gradient descent, right?
Thank you in advance.
Aris Papadopoulos
from maml.
Yes, that is correct
from maml.
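The N x K support-batch convention confirmed above can be sketched with a toy model. This is a hedged illustration, not the repo's data pipeline: the scalar least-squares "task" and all constants are invented for the example. It shows that one inner batch is the task's whole support set of N * K examples, and that test-time adaptation is just ordinary gradient descent on that support set.

```python
# Hedged sketch (toy model, not the repo's pipeline): the inner-gradient
# batch is the full support set of N * K examples, and test-time
# adaptation is plain gradient descent on it.

N, K = 5, 1                 # 5-way 1-shot => support set of N * K = 5 examples
alpha, steps = 0.01, 50     # inner learning rate / adaptation steps (illustrative)

def adapt(theta, support, alpha, steps):
    """Plain gradient descent on a scalar least-squares loss over the
    full support set; this is all test-time adaptation amounts to."""
    for _ in range(steps):
        # gradient of mean 0.5 * (theta * x - y)^2 over the support set
        g = sum((theta * x - y) * x for x, y in support) / len(support)
        theta -= alpha * g
    return theta

# Toy "task": y = 2x, with exactly N * K labeled support points.
support = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0, 4.0, 5.0]]
assert len(support) == N * K            # the inner-gradient batch size is N x K
theta_adapted = adapt(0.0, support, alpha, steps)   # approaches 2.0
```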
Both the questions and answers are very good, which help me understand the paper better. Many thanks!
from maml.
Related Issues (20)
- ModuleNotFoundError: No module named 'tensorflow.contrib'
- Pretrained weights for omniglot
- Some question about cross-entropy loss xent
- How to open the results in LOGs
- Memory efficient MAML
- wrong comparison in maml.py
- What weights you are using during testing
- Prelosses and Postlosses
- First-order approximation
- Sinusoid results
- include how to cite paper on read me
- Definition of tasks in the Meta Phase
- Why 20000 tasks are set, but only one batch task is used
- something about Computational Graph
- Testing always creates same results across 600 test points
- Why was a scheduler/annealing rate not used?
- itr is referenced before assignment in main.py, line 160
- model parameters of meta-training
- Interpretation of the results
- Create tests file to highlight usage