Variational Predictive Information Bottleneck


Alexander A. Alemi;
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-6, 2020.

Abstract

In classic papers, Zellner (1988, 2002) demonstrated that Bayesian inference could be derived as the solution to an information theoretic functional. Below we derive a generalized form of this functional as a variational lower bound of a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.
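As a sketch of the result being generalized (using common notation, not necessarily the paper's own): Zellner showed that the Bayesian posterior is the unique minimizer of a variational free-energy functional over distributions $q(\theta)$, given a prior $p(\theta)$ and likelihood $p(x \mid \theta)$:

```latex
% Zellner's information-processing functional (a commonly cited form):
% minimize over q(theta) the free energy
F[q] \;=\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta)\right)
       \;-\; \mathbb{E}_{q(\theta)}\!\left[\log p(x \mid \theta)\right].
% Completing the terms shows F[q] equals the KL to the posterior
% up to a constant in q:
F[q] \;=\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta \mid x)\right)
       \;-\; \log p(x),
% so F[q] is minimized exactly when q(theta) = p(theta | x),
% i.e. Bayes' rule is the optimal information-processing rule.
```

Since $\mathrm{KL} \ge 0$ with equality iff the arguments agree, the minimum is attained at the exact posterior $q(\theta) = p(\theta \mid x)$, recovering Bayes' rule; the paper's contribution is deriving a generalized version of such a functional from a predictive information bottleneck objective.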
