CAVIA
Code for "Fast Context Adaptation via Meta-Learning" - Luisa M Zintgraf, Kyriacos Shiarlis, Vitaly Kurin, Katja Hofmann, Shimon Whiteson (ICML 2019).
I used Python 3.7 and PyTorch 1.0.1 for these experiments.
Regression
- Running experiments:
  To run the experiment with default settings, execute
    python3 regression/main.py
  This will run the sine curve experiment. To run the CelebA image completion experiment, run
    python3 regression/main.py --task celeba --num_context_params 128 --num_hidden_layers 128 128 128 128 128 --k_meta_test 1024
  To change the number of context parameters, use the flag --num_context_params. To run MAML with the default settings, run
    python3 regression/main.py --maml --num_context_params 0 --lr_inner 0.1
  For default settings and other argument options, see regression/arguments.py.
- CelebA dataset:
  If you want to use the code for the CelebA dataset, you have to download it (http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html) and change the path in tasks_celeba.py.
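What distinguishes these runs from plain MAML is that at test time only a small vector of context parameters is adapted, while the shared network weights stay fixed. A minimal sketch of that inner loop for the sine task (all names, sizes, and the learning rate here are illustrative, not the repo's API):

```python
import torch

# Context vector size (the repo exposes this as --num_context_params).
num_context_params = 4

# Shared network: input is the scalar x concatenated with the context vector.
model = torch.nn.Sequential(
    torch.nn.Linear(1 + num_context_params, 40),
    torch.nn.ReLU(),
    torch.nn.Linear(40, 1),
)

def forward(x, phi):
    # Concatenate the same context parameters to every input in the batch.
    ctx = phi.expand(x.shape[0], -1)
    return model(torch.cat([x, ctx], dim=1))

def inner_loop(x, y, lr_inner=1.0, steps=1):
    # Context parameters are reset to zero for every new task.
    phi = torch.zeros(1, num_context_params, requires_grad=True)
    for _ in range(steps):
        loss = torch.nn.functional.mse_loss(forward(x, phi), y)
        # Differentiate w.r.t. phi only; the shared weights are not updated.
        (grad,) = torch.autograd.grad(loss, phi, create_graph=True)
        phi = phi - lr_inner * grad
    return phi

# One adaptation step on a toy sine task.
x = torch.linspace(-5, 5, 10).unsqueeze(1)
y = torch.sin(x)
phi = inner_loop(x, y)
```

Because only phi is updated in the inner loop, the outer (meta) gradient only has to flow through this small vector, which is what keeps adaptation cheap.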
Classification
- Running experiments:
  To run the experiment with default settings, execute in your command line:
    python3 classification/cavia.py
  Use the --num_filters flag to set the number of filters. For default settings and other argument options, see arguments.py.
- Retrieving Mini-Imagenet:
  You need the Mini-Imagenet dataset to run these experiments. See e.g. https://github.com/y2l/mini-imagenet-tools for how to retrieve it. Put the images in the folder classification/data/miniimagenet/images/ (the label files are already in there).
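A quick way to confirm the images ended up where the loaders expect them, before starting a long run (this check is not part of the repo; the path is the one from the instructions above):

```python
from pathlib import Path

# Folder the classification loaders read from (label files ship with the repo).
data_dir = Path("classification/data/miniimagenet/images")

# Ready only if the folder exists and contains at least one image.
ready = data_dir.is_dir() and any(data_dir.glob("*.jpg"))
print("dataset ready" if ready else f"no images found in {data_dir}")
```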
Reinforcement Learning
This code is an extended version of Tristan Deleu's PyTorch MAML implementation: https://github.com/tristandeleu/pytorch-maml-rl.
- Prerequisites:
  For the MuJoCo experiments you need mujoco-py and OpenAI Gym.
- Running experiments:
  To run an experiment on the 2D navigation task, use the following command:
python3 main.py --env-name 2DNavigation-v0 --fast-lr 1.0 --phi-size 5 0 --output-folder results
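Before launching, you can check that the prerequisites resolve. A small check (not part of the repo; note that 2DNavigation-v0 itself should only need gym, while mujoco-py is required for the locomotion tasks):

```python
import importlib.util

# Report which RL prerequisites are importable in the current environment.
for pkg in ("gym", "mujoco_py"):
    status = "ok" if importlib.util.find_spec(pkg) else "missing"
    print(f"{pkg}: {status}")
```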
Acknowledgements
Special thanks to Chelsea Finn, Jackie Loong and Tristan Deleu for their open-sourced MAML implementations. This was of great help to us, and parts of our implementation are based on the PyTorch code from:
- Jackie Loong's implementation of MAML: https://github.com/dragen1860/MAML-Pytorch
- Tristan Deleu's implementation of MAML-RL: https://github.com/tristandeleu/pytorch-maml-rl
BibTeX
@inproceedings{zintgraf2018cavia,
  title={Fast Context Adaptation via Meta-Learning},
  author={Zintgraf, Luisa M and Shiarlis, Kyriacos and Kurin, Vitaly and Hofmann, Katja and Whiteson, Shimon},
  booktitle={Thirty-sixth International Conference on Machine Learning (ICML 2019)},
  year={2019}
}