MAML (Model-Agnostic Meta-Learning) is an optimization-based meta-learning algorithm. It meta-trains a model to learn a parameter initialization that can be fine-tuned to a new task in one or a few gradient steps.

This repository implements second-order MAML on the Omniglot dataset.
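The core idea can be sketched in a few lines of PyTorch. The toy regression task, model, and hyperparameters below are illustrative assumptions and not the repository's actual code; the key detail is `create_graph=True` in the inner loop, which is what makes the meta-gradient second-order rather than first-order (FOMAML).

```python
import torch

# Minimal second-order MAML sketch on a toy 1-D regression family
# (illustrative only; the repository trains a conv net on Omniglot).
torch.manual_seed(0)

# Meta-learned initialization: a single scalar weight for y = w * x
w = torch.zeros(1, requires_grad=True)
meta_opt = torch.optim.SGD([w], lr=1e-2)
inner_lr = 0.1

def loss_fn(w, x, y):
    return ((w * x - y) ** 2).mean()

for step in range(100):
    meta_opt.zero_grad()
    # Sample a task: y = a * x for a random slope a
    a = torch.randn(1)
    x_support, x_query = torch.randn(10), torch.randn(10)
    y_support, y_query = a * x_support, a * x_query

    # Inner loop: one gradient step on the support set.
    # create_graph=True keeps the graph of this update, so the
    # outer gradient includes second-order terms (true MAML).
    inner_loss = loss_fn(w, x_support, y_support)
    (g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
    w_adapted = w - inner_lr * g

    # Outer loop: evaluate the adapted parameters on the query set
    # and backpropagate through the inner update into w.
    outer_loss = loss_fn(w_adapted, x_query, y_query)
    outer_loss.backward()
    meta_opt.step()
```

Dropping `create_graph=True` (or detaching `g`) recovers first-order MAML, which is cheaper but ignores the curvature terms.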
- PyTorch
- OpenCV
- NumPy
- tqdm
- Download the Omniglot dataset's `images_background.zip` and `images_evaluation.zip` splits here.
- Unzip the files into the `omniglot/` directory.
- Run the `train.py` script to start training with default options. Run `python train.py -h` for a description of the arguments.
- For evaluation, run the `evaluate.py` script.
- To make predictions on new data, refer to `Test.ipynb`.
- Alternatively, see this implementation: https://github.com/oscarknagg/few-shot
- Chelsea Finn, Pieter Abbeel, Sergey Levine. *Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks*. https://arxiv.org/abs/1703.03400