
bamos/HowToTrainYourMAMLPytorch

 
 


MAML++ with higher model+optimizer exploration

Here we modify the official MAML++ code to use the higher library for both the model and the inner-loop optimizer, so that we can ablate across them independently. We also add Hydra for experiment management.
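The sketch below is a minimal illustration, assuming a few things not fixed by this README (a classification loss, an SGD inner optimizer, and a hypothetical task-batch format of (support, query) pairs), of how higher's `innerloop_ctx` wraps both the model and the inner optimizer so that the query-set gradient flows back through the adapted parameters. It is not the repository's exact code.

```python
# Minimal MAML-style meta-update using higher (illustrative only; names and
# hyperparameters here are assumptions, not the repository's configuration).
import torch
import torch.nn as nn
import torch.nn.functional as F
import higher


def maml_outer_step(model: nn.Module,
                    meta_opt: torch.optim.Optimizer,
                    task_batch,              # hypothetical iterable of ((x_spt, y_spt), (x_qry, y_qry))
                    inner_lr: float = 0.01,
                    inner_steps: int = 5) -> float:
    """Run one meta-update over a batch of tasks and return the mean query loss."""
    meta_opt.zero_grad()
    query_losses = []

    for (x_spt, y_spt), (x_qry, y_qry) in task_batch:
        inner_opt = torch.optim.SGD(model.parameters(), lr=inner_lr)

        # higher returns a functional copy of the model and a differentiable
        # optimizer, so gradients can flow back through the inner updates.
        with higher.innerloop_ctx(model, inner_opt,
                                  copy_initial_weights=False) as (fmodel, diffopt):
            for _ in range(inner_steps):
                spt_loss = F.cross_entropy(fmodel(x_spt), y_spt)
                diffopt.step(spt_loss)   # differentiable parameter update

            qry_loss = F.cross_entropy(fmodel(x_qry), y_qry)
            qry_loss.backward()          # accumulates grads into model.parameters()
            query_losses.append(qry_loss.detach())

    meta_opt.step()
    return torch.stack(query_losses).mean().item()
```

Because higher patches the model and optimizer functionally, the inner optimizer or the model wrapper can be swapped out independently, which is the kind of model/optimizer ablation the paragraph above refers to.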

About

The original code for the paper "How to train your MAML", along with a replication of the original "Model-Agnostic Meta-Learning" (MAML) paper in PyTorch.
