Iterative Amortized Inference

Joe Marino, Yisong Yue, Stephan Mandt
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3403-3412, 2018.

Abstract

Inference models are a key component in scaling variational inference to deep latent variable models, most notably as encoder networks in variational auto-encoders (VAEs). By replacing conventional optimization-based inference with a learned model, inference is amortized over data examples and therefore more computationally efficient. However, standard inference models are restricted to direct mappings from data to approximate posterior estimates. The failure of these models to reach fully optimized approximate posterior estimates results in an amortization gap. We aim toward closing this gap by proposing iterative inference models, which learn to perform inference optimization through repeatedly encoding gradients. Our approach generalizes standard inference models in VAEs and provides insight into several empirical findings, including top-down inference techniques. We demonstrate the inference optimization capabilities of iterative inference models and show that they outperform standard inference models on several benchmark data sets of images and text.
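To make the mechanism concrete, below is a minimal sketch (in PyTorch) of iterative amortized inference for a VAE with a Gaussian approximate posterior. It is a simplified illustration under stated assumptions, not the authors' exact architecture: the names (IterativeInferenceModel, infer), the network sizes, the additive update rule, and the five-step refinement are all assumptions; the core idea, repeatedly encoding ELBO gradients to refine the approximate posterior estimate, follows the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F

class IterativeInferenceModel(nn.Module):
    # Hypothetical sketch: encodes the current posterior parameters and
    # their ELBO gradients, and outputs an additive refinement (a learned
    # optimizer step), rather than mapping data directly to the posterior.
    def __init__(self, latent_dim, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4 * latent_dim, hidden_dim),
            nn.ELU(),
            nn.Linear(hidden_dim, 2 * latent_dim),
        )

    def forward(self, mu, log_var, grad_mu, grad_log_var):
        delta = self.net(torch.cat([mu, log_var, grad_mu, grad_log_var], dim=-1))
        d_mu, d_log_var = delta.chunk(2, dim=-1)
        return mu + d_mu, log_var + d_log_var

def neg_elbo(x, mu, log_var, decoder):
    # Negative ELBO with a reparameterized sample, a Bernoulli likelihood
    # (decoder is assumed to output probabilities), and a standard normal prior.
    z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
    recon = F.binary_cross_entropy(decoder(z), x, reduction='none').sum(-1)
    kl = 0.5 * (mu.pow(2) + log_var.exp() - 1 - log_var).sum(-1)
    return (recon + kl).mean()

def infer(x, decoder, inference_model, latent_dim, steps=5):
    # Start from an uninformative estimate and refine it for a few steps.
    mu = torch.zeros(x.size(0), latent_dim, requires_grad=True)
    log_var = torch.zeros(x.size(0), latent_dim, requires_grad=True)
    for _ in range(steps):
        loss = neg_elbo(x, mu, log_var, decoder)
        grad_mu, grad_lv = torch.autograd.grad(loss, (mu, log_var), create_graph=True)
        mu, log_var = inference_model(mu, log_var, grad_mu, grad_lv)
    return mu, log_var

Training would minimize the negative ELBO at the final (or every) refinement step with respect to both the decoder and the inference model; create_graph=True keeps the encoded gradients differentiable so the inference model can learn to optimize.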

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-marino18a,
  title     = {Iterative Amortized Inference},
  author    = {Marino, Joe and Yue, Yisong and Mandt, Stephan},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3403--3412},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/marino18a/marino18a.pdf},
  url       = {https://proceedings.mlr.press/v80/marino18a.html}
}
Endnote
%0 Conference Paper
%T Iterative Amortized Inference
%A Joe Marino
%A Yisong Yue
%A Stephan Mandt
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-marino18a
%I PMLR
%P 3403--3412
%U https://proceedings.mlr.press/v80/marino18a.html
%V 80
APA
Marino, J., Yue, Y. & Mandt, S. (2018). Iterative Amortized Inference. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3403-3412. Available from https://proceedings.mlr.press/v80/marino18a.html.