MetaFun: Meta-Learning with Iterative Functional Updates

Jin Xu, Jean-Francois Ton, Hyunjik Kim, Adam Kosiorek, Yee Whye Teh
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10617-10627, 2020.

Abstract

We develop a functional encoder-decoder approach to supervised meta-learning, where labeled data is encoded into an infinite-dimensional functional representation rather than a finite-dimensional one. Furthermore, rather than directly producing the representation, we learn a neural update rule resembling functional gradient descent which iteratively improves the representation. The final representation is used to condition the decoder to make predictions on unlabeled data. Our approach is the first to demonstrate the success of encoder-decoder style meta-learning methods like conditional neural processes on large-scale few-shot classification benchmarks such as miniImageNet and tieredImageNet, where it achieves state-of-the-art performance.
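The abstract's iterative update can be made concrete with a short sketch. The following minimal PyTorch code is a hypothetical illustration, not the authors' implementation: the update network, the RBF kernel, the step size alpha, and the iteration count T are all assumptions chosen for brevity. Local updates are computed at the labeled context points from the current representation and labels, propagated to every evaluation point by kernel weighting (resembling a functional gradient step), and the final representation at the query points is decoded into predictions.

import torch
import torch.nn as nn

class MetaFunSketch(nn.Module):
    """Hypothetical sketch of encoder-decoder meta-learning with iterative functional updates."""
    def __init__(self, y_dim, r_dim=64, T=5, alpha=0.1):
        super().__init__()
        self.T, self.alpha, self.r_dim = T, alpha, r_dim
        # Learned local update rule u(r(x_i), y_i); stands in for a functional gradient.
        self.update_net = nn.Sequential(
            nn.Linear(r_dim + y_dim, 64), nn.ReLU(), nn.Linear(64, r_dim))
        # Decoder maps the final representation at a query point to a prediction.
        self.decoder = nn.Sequential(
            nn.Linear(r_dim, 64), nn.ReLU(), nn.Linear(64, y_dim))

    @staticmethod
    def kernel(a, b, lengthscale=1.0):
        # RBF kernel: propagates local updates from context points to all points.
        return torch.exp(-torch.cdist(a, b).pow(2) / (2 * lengthscale ** 2))

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Functional representation r(.) evaluated at context and target inputs,
        # initialised to zero and refined over T iterations.
        r_ctx = x_ctx.new_zeros(x_ctx.size(0), self.r_dim)
        r_tgt = x_tgt.new_zeros(x_tgt.size(0), self.r_dim)
        K_cc = self.kernel(x_ctx, x_ctx)
        K_tc = self.kernel(x_tgt, x_ctx)
        for _ in range(self.T):
            # Local updates at labeled context points.
            u = self.update_net(torch.cat([r_ctx, y_ctx], dim=-1))
            # Kernel-weighted propagation of the updates (functional-gradient-like step).
            r_ctx = r_ctx - self.alpha * K_cc @ u
            r_tgt = r_tgt - self.alpha * K_tc @ u
        # Condition the decoder on the final representation at unlabeled targets.
        return self.decoder(r_tgt)

# Example: a 1-D regression task with 10 labeled and 20 unlabeled points.
model = MetaFunSketch(y_dim=1)
x_ctx, y_ctx, x_tgt = torch.randn(10, 1), torch.randn(10, 1), torch.randn(20, 1)
preds = model(x_ctx, y_ctx, x_tgt)  # shape (20, 1)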

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-xu20i,
  title     = {{M}eta{F}un: Meta-Learning with Iterative Functional Updates},
  author    = {Xu, Jin and Ton, Jean-Francois and Kim, Hyunjik and Kosiorek, Adam and Teh, Yee Whye},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10617--10627},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/xu20i/xu20i.pdf},
  url       = {https://proceedings.mlr.press/v119/xu20i.html},
  abstract  = {We develop a functional encoder-decoder approach to supervised meta-learning, where labeled data is encoded into an infinite-dimensional functional representation rather than a finite-dimensional one. Furthermore, rather than directly producing the representation, we learn a neural update rule resembling functional gradient descent which iteratively improves the representation. The final representation is used to condition the decoder to make predictions on unlabeled data. Our approach is the first to demonstrate the success of encoder-decoder style meta-learning methods like conditional neural processes on large-scale few-shot classification benchmarks such as miniImageNet and tieredImageNet, where it achieves state-of-the-art performance.}
}
Endnote
%0 Conference Paper
%T MetaFun: Meta-Learning with Iterative Functional Updates
%A Jin Xu
%A Jean-Francois Ton
%A Hyunjik Kim
%A Adam Kosiorek
%A Yee Whye Teh
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-xu20i
%I PMLR
%P 10617--10627
%U https://proceedings.mlr.press/v119/xu20i.html
%V 119
%X We develop a functional encoder-decoder approach to supervised meta-learning, where labeled data is encoded into an infinite-dimensional functional representation rather than a finite-dimensional one. Furthermore, rather than directly producing the representation, we learn a neural update rule resembling functional gradient descent which iteratively improves the representation. The final representation is used to condition the decoder to make predictions on unlabeled data. Our approach is the first to demonstrate the success of encoder-decoder style meta-learning methods like conditional neural processes on large-scale few-shot classification benchmarks such as miniImageNet and tieredImageNet, where it achieves state-of-the-art performance.
APA
Xu, J., Ton, J., Kim, H., Kosiorek, A. & Teh, Y.W. (2020). MetaFun: Meta-Learning with Iterative Functional Updates. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10617-10627. Available from https://proceedings.mlr.press/v119/xu20i.html.