Rapid Adaptation with Conditionally Shifted Neurons

Tsendsuren Munkhdalai, Xingdi Yuan, Soroush Mehri, Adam Trischler
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3664-3673, 2018.

Abstract

We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.
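To make the abstract's description of the mechanism concrete, here is a minimal NumPy sketch of the general idea (all names, shapes, and the random placeholder shift values are illustrative assumptions, not the authors' implementation): a layer's pre-activations are modified by a task-specific shift retrieved, via soft attention, from a small memory populated from a few support examples of the new task.

import numpy as np

rng = np.random.default_rng(0)

d_in, d_hid, n_support = 8, 16, 5

# Base (slow) weights; in the full method these are meta-learned across tasks.
W = rng.normal(scale=0.1, size=(d_hid, d_in))
b = np.zeros(d_hid)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# --- Description phase: populate memory from the support set. ---
# Each support example contributes a key (here, its hidden pre-activation)
# and a value (a conditioning shift; random placeholders in this sketch).
support_x = rng.normal(size=(n_support, d_in))
keys = support_x @ W.T + b                                 # (n_support, d_hid)
values = rng.normal(scale=0.05, size=(n_support, d_hid))   # placeholder shifts

# --- Prediction phase: conditionally shift neurons for a query input. ---
def conditionally_shifted_layer(x):
    pre = W @ x + b                      # ordinary pre-activation
    attn = softmax(keys @ pre)           # attend over memory keys
    shift = attn @ values                # retrieve task-specific shift
    return np.maximum(pre + shift, 0.0)  # shifted ReLU activation

query_x = rng.normal(size=d_in)
print(conditionally_shifted_layer(query_x).shape)          # (16,)

The sketch only shows the retrieval-and-shift step; in the paper the stored shift values are computed from task conditioning information gathered on the support set rather than drawn at random.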

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-munkhdalai18a,
  title     = {Rapid Adaptation with Conditionally Shifted Neurons},
  author    = {Munkhdalai, Tsendsuren and Yuan, Xingdi and Mehri, Soroush and Trischler, Adam},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3664--3673},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/munkhdalai18a/munkhdalai18a.pdf},
  url       = {https://proceedings.mlr.press/v80/munkhdalai18a.html},
  abstract  = {We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.}
}
Endnote
%0 Conference Paper
%T Rapid Adaptation with Conditionally Shifted Neurons
%A Tsendsuren Munkhdalai
%A Xingdi Yuan
%A Soroush Mehri
%A Adam Trischler
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-munkhdalai18a
%I PMLR
%P 3664--3673
%U https://proceedings.mlr.press/v80/munkhdalai18a.html
%V 80
%X We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons. We apply this mechanism in the framework of metalearning, where the aim is to replicate some of the flexibility of human learning in machines. Conditionally shifted neurons modify their activation values with task-specific shifts retrieved from a memory module, which is populated rapidly based on limited task experience. On metalearning benchmarks from the vision and language domains, models augmented with conditionally shifted neurons achieve state-of-the-art results.
APA
Munkhdalai, T., Yuan, X., Mehri, S. & Trischler, A. (2018). Rapid Adaptation with Conditionally Shifted Neurons. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3664-3673. Available from https://proceedings.mlr.press/v80/munkhdalai18a.html.