Meta-Learning with Shared Amortized Variational Inference

Ekaterina Iakovleva, Jakob Verbeek, Karteek Alahari
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4572-4582, 2020.

Abstract

We propose a novel amortized variational inference scheme for an empirical Bayes meta-learning model, where model parameters are treated as latent variables. We learn the prior distribution over model parameters conditioned on limited training data using a variational autoencoder approach. Our framework proposes sharing the same amortized inference network between the conditional prior and variational posterior distributions over the model parameters. While the posterior leverages both the labeled support and query data, the conditional prior is based only on the labeled support data. We show that in earlier work, relying on Monte-Carlo approximation, the conditional prior collapses to a Dirac delta function. In contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on the miniImageNet, CIFAR-FS and FC100 datasets, and present results demonstrating its advantages over previous work.
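
As a concrete reading of this scheme (a sketch based only on this abstract, with illustrative notation: S the labeled support set, Q the query set, z the latent model parameters, θ the predictive model, and φ the shared inference network), the per-task training objective is an evidence lower bound of the form

    \mathcal{L}(\theta, \phi) = \mathbb{E}_{q_\phi(z \mid S, Q)}\left[\log p_\theta(Y_Q \mid X_Q, z)\right] - \mathrm{KL}\left(q_\phi(z \mid S, Q) \,\|\, p_\phi(z \mid S)\right),

where the same amortized network φ parameterizes both the variational posterior q_φ, which conditions on support and query data, and the conditional prior p_φ, which is obtained from the same network applied to the support data alone. With parametric (e.g., Gaussian) distributions, the KL term can be evaluated in closed form between the two distributions rather than approximated with samples from the prior, which is a plausible reading of why the variational treatment avoids the collapse to a Dirac delta described above.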

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-iakovleva20a,
  title     = {Meta-Learning with Shared Amortized Variational Inference},
  author    = {Iakovleva, Ekaterina and Verbeek, Jakob and Alahari, Karteek},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4572--4582},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/iakovleva20a/iakovleva20a.pdf},
  url       = {https://proceedings.mlr.press/v119/iakovleva20a.html},
  abstract  = {We propose a novel amortized variational inference scheme for an empirical Bayes meta-learning model, where model parameters are treated as latent variables. We learn the prior distribution over model parameters conditioned on limited training data using a variational autoencoder approach. Our framework proposes sharing the same amortized inference network between the conditional prior and variational posterior distributions over the model parameters. While the posterior leverages both the labeled support and query data, the conditional prior is based only on the labeled support data. We show that in earlier work, relying on Monte-Carlo approximation, the conditional prior collapses to a Dirac delta function. In contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on the miniImageNet, CIFAR-FS and FC100 datasets, and present results demonstrating its advantages over previous work.}
}
Endnote
%0 Conference Paper
%T Meta-Learning with Shared Amortized Variational Inference
%A Ekaterina Iakovleva
%A Jakob Verbeek
%A Karteek Alahari
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-iakovleva20a
%I PMLR
%P 4572--4582
%U https://proceedings.mlr.press/v119/iakovleva20a.html
%V 119
%X We propose a novel amortized variational inference scheme for an empirical Bayes meta-learning model, where model parameters are treated as latent variables. We learn the prior distribution over model parameters conditioned on limited training data using a variational autoencoder approach. Our framework proposes sharing the same amortized inference network between the conditional prior and variational posterior distributions over the model parameters. While the posterior leverages both the labeled support and query data, the conditional prior is based only on the labeled support data. We show that in earlier work, relying on Monte-Carlo approximation, the conditional prior collapses to a Dirac delta function. In contrast, our variational approach prevents this collapse and preserves uncertainty over the model parameters. We evaluate our approach on the miniImageNet, CIFAR-FS and FC100 datasets, and present results demonstrating its advantages over previous work.
APA
Iakovleva, E., Verbeek, J. & Alahari, K. (2020). Meta-Learning with Shared Amortized Variational Inference. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4572-4582. Available from https://proceedings.mlr.press/v119/iakovleva20a.html.
