Variational Inference and Model Selection with Generalized Evidence Bounds

Liqun Chen, Chenyang Tao, Ruiyi Zhang, Ricardo Henao, Lawrence Carin (Duke University)
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:893-902, 2018.

Abstract

Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.
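
For context, the conventional, importance-weighted, and Rényi bounds mentioned in the abstract are standard results from the variational inference literature, not the paper's generalized bound itself. A brief sketch in LaTeX notation, with generative model p(x, z) and variational posterior q(z | x):

\log p(x) \;\ge\; \mathcal{L}_{\mathrm{ELBO}} = \mathbb{E}_{q(z\mid x)}\!\left[\log p(x,z) - \log q(z\mid x)\right] \qquad \text{(conventional variational bound)}

\log p(x) \;\ge\; \mathcal{L}_{K} = \mathbb{E}_{z_{1:K}\sim q(z\mid x)}\!\left[\log \frac{1}{K}\sum_{k=1}^{K}\frac{p(x,z_k)}{q(z_k\mid x)}\right] \qquad \text{(importance-weighted bound, Burda et al., 2016)}

\log p(x) \;\ge\; \mathcal{L}_{\alpha} = \frac{1}{1-\alpha}\log \mathbb{E}_{q(z\mid x)}\!\left[\left(\frac{p(x,z)}{q(z\mid x)}\right)^{1-\alpha}\right], \quad \alpha > 0 \qquad \text{(Rényi bound, Li \& Turner, 2016)}

Setting K = 1 in the importance-weighted bound, or letting \alpha \to 1 in the Rényi bound, recovers the ELBO; the paper's generalized evidence bound is stated to subsume both families as special cases and to be provably sharper.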

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-chen18k,
  title     = {Variational Inference and Model Selection with Generalized Evidence Bounds},
  author    = {Chen, Liqun and Tao, Chenyang and Zhang, Ruiyi and Henao, Ricardo and Carin, Lawrence},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {893--902},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/chen18k/chen18k.pdf},
  url       = {https://proceedings.mlr.press/v80/chen18k.html},
  abstract  = {Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.}
}
Endnote
%0 Conference Paper
%T Variational Inference and Model Selection with Generalized Evidence Bounds
%A Liqun Chen
%A Chenyang Tao
%A Ruiyi Zhang
%A Ricardo Henao
%A Lawrence Carin
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-chen18k
%I PMLR
%P 893--902
%U https://proceedings.mlr.press/v80/chen18k.html
%V 80
%X Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.
APA
Chen, L., Tao, C., Zhang, R., Henao, R. & Carin, L. (2018). Variational Inference and Model Selection with Generalized Evidence Bounds. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:893-902. Available from https://proceedings.mlr.press/v80/chen18k.html.
