
Model-Agnostic Meta-Learning

MAML is a model-agnostic, optimization-based meta-learning algorithm. It meta-trains a model to learn a parameter initialization that can be fine-tuned to a new task in a single gradient update.

This repository implements second-order MAML on the Omniglot dataset.
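The inner/outer update can be sketched as follows. This is a minimal PyTorch illustration with a toy functional linear model; the function and variable names are illustrative assumptions, not this repository's actual code:

```python
import torch

def forward(params, x):
    # Tiny functional model: a single linear layer (illustrative only).
    w, b = params
    return x @ w + b

def maml_meta_step(params, tasks, inner_lr=0.4, meta_lr=1e-3):
    """One second-order MAML update over a batch of tasks.

    Each task is (x_support, y_support, x_query, y_query). Passing
    create_graph=True keeps the inner gradient differentiable, which is
    what makes this the second-order variant.
    """
    meta_loss = 0.0
    for xs, ys, xq, yq in tasks:
        # Inner loop: one gradient step on the support set.
        support_loss = torch.nn.functional.mse_loss(forward(params, xs), ys)
        grads = torch.autograd.grad(support_loss, params, create_graph=True)
        adapted = [p - inner_lr * g for p, g in zip(params, grads)]
        # Outer objective: query loss of the adapted parameters.
        meta_loss = meta_loss + torch.nn.functional.mse_loss(forward(adapted, xq), yq)
    meta_loss = meta_loss / len(tasks)
    # Outer loop: differentiate through the inner update (second order).
    meta_grads = torch.autograd.grad(meta_loss, params)
    with torch.no_grad():
        for p, g in zip(params, meta_grads):
            p -= meta_lr * g
    return meta_loss.item()
```

The update to `params` flows through the inner gradient step itself, so the meta-gradient includes second-derivative terms; dropping `create_graph=True` would reduce this to first-order MAML.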

Requirements

  • PyTorch
  • OpenCV
  • NumPy
  • tqdm

Usage

  1. Download the Omniglot dataset's images_background.zip and images_evaluation.zip splits here.
  2. Unzip the files into the omniglot/ directory.
  3. Run the train.py script to start training with the default options. Run python train.py -h for a description of the arguments.
  4. For evaluation, run the evaluate.py script.
  5. To make predictions on new data, refer to Test.ipynb.
  6. Alternatively, open Test.ipynb in Colab.
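Steps 2–4 above can be run from a shell as follows (assuming the two zip files from step 1 have already been downloaded into the repository root):

```shell
# Unzip the Omniglot splits into the omniglot/ directory.
mkdir -p omniglot
unzip -d omniglot images_background.zip
unzip -d omniglot images_evaluation.zip

python train.py -h    # list the available training arguments
python train.py       # train with the default options
python evaluate.py    # evaluate the trained model
```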

References