Variational Predictive Information Bottleneck

Alexander A. Alemi
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-6, 2020.

Abstract

In classic papers, Zellner (1988, 2002) demonstrated that Bayesian inference could be derived as the solution to an information theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-alemi20a,
  title = {Variational Predictive Information Bottleneck},
  author = {Alemi, Alexander A.},
  booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference},
  pages = {1--6},
  year = {2020},
  editor = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen},
  volume = {118},
  series = {Proceedings of Machine Learning Research},
  month = {08 Dec},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v118/alemi20a/alemi20a.pdf},
  url = {https://proceedings.mlr.press/v118/alemi20a.html},
  abstract = {In classic papers, Zellner (1988, 2002) demonstrated that Bayesian inference could be derived as the solution to an information theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.}
}
Endnote
%0 Conference Paper
%T Variational Predictive Information Bottleneck
%A Alexander A. Alemi
%B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2020
%E Cheng Zhang
%E Francisco Ruiz
%E Thang Bui
%E Adji Bousso Dieng
%E Dawen Liang
%F pmlr-v118-alemi20a
%I PMLR
%P 1--6
%U https://proceedings.mlr.press/v118/alemi20a.html
%V 118
%X In classic papers, Zellner (1988, 2002) demonstrated that Bayesian inference could be derived as the solution to an information theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.
APA
Alemi, A. A. (2020). Variational Predictive Information Bottleneck. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-6. Available from https://proceedings.mlr.press/v118/alemi20a.html.