All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference

Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1111-1122, 2020.

Abstract

The recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a family of variational inference objectives, which both tighten and generalize the ubiquitous Evidence Lower Bound (ELBO). However, the tightness of TVO bounds was not previously known, an expensive grid search was used to choose a “schedule” of intermediate distributions, and model learning suffered with ostensibly tighter bounds. In this work, we propose an exponential family interpretation of the geometric mixture curve underlying the TVO and various path sampling methods, which allows us to characterize the gap in TVO likelihood bounds as a sum of KL divergences. We propose to choose intermediate distributions using equal spacing in the moment parameters of our exponential family, which matches grid search performance and allows the schedule to adaptively update over the course of training. Finally, we derive a doubly reparameterized gradient estimator which improves model learning and allows the TVO to benefit from more refined bounds. To further contextualize our contributions, we provide a unified framework for understanding thermodynamic integration and the TVO using Taylor series remainders.
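To make the objective concrete: the TVO rests on the identity log p(x) = ∫₀¹ E_{π_β}[log w] dβ, where w = p(x,z)/q(z|x) and π_β(z|x) ∝ q(z|x)^(1−β) p(x,z)^β is the geometric mixture path; a left Riemann sum of this integral lower-bounds log p(x), with the two-point schedule β ∈ {0, 1} recovering the ELBO. The NumPy sketch below illustrates that estimator together with a moment-spaced schedule in the spirit of the paper. It is not the authors' code: the function names are illustrative, and the path expectations are approximated by self-normalized importance sampling over samples from q.

import numpy as np

def tvo_lower_bound(log_w, betas):
    # log_w: (S,) log importance weights log p(x,z) - log q(z|x), z ~ q(z|x).
    # betas: increasing schedule with betas[0] == 0.0 and betas[-1] == 1.0.
    betas = np.asarray(betas, dtype=float)
    # E_{pi_beta}[log w] via self-normalized importance sampling:
    # pi_beta(z|x) is proportional to q(z|x) * w(z)**beta, so samples from q
    # are reweighted by softmax(beta * log_w).
    logits = betas[:, None] * log_w[None, :]        # (K+1, S)
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    snis = np.exp(logits)
    snis /= snis.sum(axis=1, keepdims=True)
    moments = (snis * log_w[None, :]).sum(axis=1)   # eta(beta_k) = E_{pi_beta_k}[log w]
    # Left Riemann sum of the thermodynamic integral: a lower bound, since
    # eta(beta) is nondecreasing (its derivative is Var_{pi_beta}[log w] >= 0).
    return float(np.sum(np.diff(betas) * moments[:-1]))

def moment_spaced_schedule(log_w, K, grid_size=1000):
    # Choose K+1 betas approximately equally spaced in the moment parameter
    # eta(beta) = E_{pi_beta}[log w], estimated on a dense grid over [0, 1].
    grid = np.linspace(0.0, 1.0, grid_size)
    logits = grid[:, None] * log_w[None, :]
    logits -= logits.max(axis=1, keepdims=True)
    snis = np.exp(logits)
    snis /= snis.sum(axis=1, keepdims=True)
    eta = (snis * log_w[None, :]).sum(axis=1)       # monotone in beta
    targets = np.linspace(eta[0], eta[-1], K + 1)
    return grid[np.searchsorted(eta, targets).clip(0, grid_size - 1)]

# Usage with stand-in log weights (purely illustrative):
rng = np.random.default_rng(0)
log_w = rng.normal(-1.0, 1.0, size=10_000)
print(tvo_lower_bound(log_w, moment_spaced_schedule(log_w, K=10)))

With betas = [0.0, 1.0] this reduces to a Monte Carlo ELBO estimate; replacing moments[:-1] with moments[1:] gives the corresponding right-Riemann upper bound, and the gap between the two bounds decomposes into the sum of KL divergences characterized in the paper.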

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-brekelmans20a,
  title     = {All in the Exponential Family: {B}regman Duality in Thermodynamic Variational Inference},
  author    = {Brekelmans, Rob and Masrani, Vaden and Wood, Frank and Steeg, Greg Ver and Galstyan, Aram},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1111--1122},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/brekelmans20a/brekelmans20a.pdf},
  url       = {https://proceedings.mlr.press/v119/brekelmans20a.html},
  abstract  = {The recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a family of variational inference objectives, which both tighten and generalize the ubiquitous Evidence Lower Bound (ELBO). However, the tightness of TVO bounds was not previously known, an expensive grid search was used to choose a “schedule” of intermediate distributions, and model learning suffered with ostensibly tighter bounds. In this work, we propose an exponential family interpretation of the geometric mixture curve underlying the TVO and various path sampling methods, which allows us to characterize the gap in TVO likelihood bounds as a sum of KL divergences. We propose to choose intermediate distributions using equal spacing in the moment parameters of our exponential family, which matches grid search performance and allows the schedule to adaptively update over the course of training. Finally, we derive a doubly reparameterized gradient estimator which improves model learning and allows the TVO to benefit from more refined bounds. To further contextualize our contributions, we provide a unified framework for understanding thermodynamic integration and the TVO using Taylor series remainders.}
}
Endnote
%0 Conference Paper
%T All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference
%A Rob Brekelmans
%A Vaden Masrani
%A Frank Wood
%A Greg Ver Steeg
%A Aram Galstyan
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-brekelmans20a
%I PMLR
%P 1111--1122
%U https://proceedings.mlr.press/v119/brekelmans20a.html
%V 119
%X The recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a family of variational inference objectives, which both tighten and generalize the ubiquitous Evidence Lower Bound (ELBO). However, the tightness of TVO bounds was not previously known, an expensive grid search was used to choose a “schedule” of intermediate distributions, and model learning suffered with ostensibly tighter bounds. In this work, we propose an exponential family interpretation of the geometric mixture curve underlying the TVO and various path sampling methods, which allows us to characterize the gap in TVO likelihood bounds as a sum of KL divergences. We propose to choose intermediate distributions using equal spacing in the moment parameters of our exponential family, which matches grid search performance and allows the schedule to adaptively update over the course of training. Finally, we derive a doubly reparameterized gradient estimator which improves model learning and allows the TVO to benefit from more refined bounds. To further contextualize our contributions, we provide a unified framework for understanding thermodynamic integration and the TVO using Taylor series remainders.
APA
Brekelmans, R., Masrani, V., Wood, F., Ver Steeg, G. & Galstyan, A. (2020). All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1111-1122. Available from https://proceedings.mlr.press/v119/brekelmans20a.html.
