Fast Context Adaptation via Meta-Learning

Luisa Zintgraf, Kyriacos Shiarlis, Vitaly Kurin, Katja Hofmann, Shimon Whiteson
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7693-7702, 2019.

Abstract

We propose CAVIA for meta-learning, a simple extension to MAML that is less prone to meta-overfitting, easier to parallelise, and more interpretable. CAVIA partitions the model parameters into two parts: context parameters that serve as additional input to the model and are adapted on individual tasks, and shared parameters that are meta-trained and shared across tasks. At test time, only the context parameters are updated, leading to a low-dimensional task representation. We show empirically that CAVIA outperforms MAML for regression, classification, and reinforcement learning. Our experiments also highlight weaknesses in current benchmarks, in that the amount of adaptation needed in some cases is small.
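The page ships no code, but the adaptation scheme the abstract describes is compact enough to sketch. Below is a minimal, hypothetical PyTorch rendering of CAVIA's inner loop for a regression model: the context parameters are a small vector concatenated to the input and reset to zero for each task, and only they receive gradient updates at adaptation time, while the network weights play the role of the shared, meta-trained parameters. Names (CaviaRegressor, adapt) and hyperparameters are illustrative, not taken from the paper.

import torch
import torch.nn as nn

class CaviaRegressor(nn.Module):
    """Shared network whose input is augmented with task-specific context parameters."""
    def __init__(self, x_dim=1, ctx_dim=5, hidden=40):
        super().__init__()
        self.ctx_dim = ctx_dim
        self.net = nn.Sequential(
            nn.Linear(x_dim + ctx_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, ctx):
        # The context enters as extra input dimensions, broadcast over the batch.
        return self.net(torch.cat([x, ctx.expand(x.size(0), -1)], dim=-1))

def adapt(model, x, y, inner_lr=1.0, steps=1):
    # Inner loop: the context vector starts at zero for every task and is the
    # only thing updated here; the shared weights in model.net stay fixed.
    ctx = torch.zeros(1, model.ctx_dim, requires_grad=True)
    for _ in range(steps):
        loss = nn.functional.mse_loss(model(x, ctx), y)
        # create_graph=True during meta-training so the outer loop can
        # backpropagate through this update into the shared weights.
        (grad,) = torch.autograd.grad(loss, ctx, create_graph=model.training)
        ctx = ctx - inner_lr * grad
    return ctx

At meta-test time only adapt runs, so each task is summarised by the low-dimensional ctx vector. An outer loop (not shown) would evaluate the loss on held-out task data using the adapted ctx and step a standard optimiser on the parameters of model.net.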

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-zintgraf19a,
  title     = {Fast Context Adaptation via Meta-Learning},
  author    = {Zintgraf, Luisa and Shiarlis, Kyriacos and Kurin, Vitaly and Hofmann, Katja and Whiteson, Shimon},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7693--7702},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/zintgraf19a/zintgraf19a.pdf},
  url       = {https://proceedings.mlr.press/v97/zintgraf19a.html}
}
Endnote
%0 Conference Paper
%T Fast Context Adaptation via Meta-Learning
%A Luisa Zintgraf
%A Kyriacos Shiarlis
%A Vitaly Kurin
%A Katja Hofmann
%A Shimon Whiteson
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-zintgraf19a
%I PMLR
%P 7693--7702
%U https://proceedings.mlr.press/v97/zintgraf19a.html
%V 97
APA
Zintgraf, L., Shiarlis, K., Kurin, V., Hofmann, K. & Whiteson, S. (2019). Fast Context Adaptation via Meta-Learning. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7693-7702. Available from https://proceedings.mlr.press/v97/zintgraf19a.html.
