On Statistical Optimality of Variational Bayes

Debdeep Pati, Anirban Bhattacharya, Yun Yang
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1579-1588, 2018.

Abstract

The article addresses a long-standing open problem on the justification of using variational Bayes methods for parameter estimation. We provide general conditions for obtaining optimal risk bounds for point estimates acquired from mean-field variational Bayesian inference. The conditions pertain to the existence of certain test functions for the distance metric on the parameter space and minimal assumptions on the prior. A general recipe for verification of the conditions is outlined, which is broadly applicable to existing Bayesian models with or without latent variables. As illustrations, specific applications to Latent Dirichlet Allocation and Gaussian mixture models are discussed.
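For context, the mean-field variational posterior referred to in the abstract is commonly defined as the closest product-form distribution to the exact posterior in Kullback-Leibler divergence. The display below is a standard textbook formulation under that definition; the notation is not taken from the paper itself.

% Standard definition of the mean-field variational posterior (textbook notation, not the paper's).
\[
  \widehat{q} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}_{\mathrm{MF}}} \;
  \mathrm{KL}\big(q \,\|\, p(\theta \mid X_1,\ldots,X_n)\big),
  \qquad
  \mathcal{Q}_{\mathrm{MF}} \;=\; \Big\{ q : q(\theta) = \textstyle\prod_{j=1}^{d} q_j(\theta_j) \Big\}.
\]
% A point estimate, e.g. \(\widehat{\theta} = \mathbb{E}_{\widehat{q}}[\theta]\), is then reported from the fitted variational posterior.

The risk bounds discussed in the abstract concern point estimates of this type, i.e. summaries (such as the mean) of the fitted mean-field variational posterior.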

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-pati18a,
  title     = {On Statistical Optimality of Variational Bayes},
  author    = {Pati, Debdeep and Bhattacharya, Anirban and Yang, Yun},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1579--1588},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/pati18a/pati18a.pdf},
  url       = {https://proceedings.mlr.press/v84/pati18a.html},
  abstract  = {The article addresses a long-standing open problem on the justification of using variational Bayes methods for parameter estimation. We provide general conditions for obtaining optimal risk bounds for point estimates acquired from mean-field variational Bayesian inference. The conditions pertain to the existence of certain test functions for the distance metric on the parameter space and minimal assumptions on the prior. A general recipe for verification of the conditions is outlined which is broadly applicable to existing Bayesian models with or without latent variables. As illustrations, specific applications to Latent Dirichlet Allocation and Gaussian mixture models are discussed.}
}
Endnote
%0 Conference Paper
%T On Statistical Optimality of Variational Bayes
%A Debdeep Pati
%A Anirban Bhattacharya
%A Yun Yang
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-pati18a
%I PMLR
%P 1579--1588
%U https://proceedings.mlr.press/v84/pati18a.html
%V 84
%X The article addresses a long-standing open problem on the justification of using variational Bayes methods for parameter estimation. We provide general conditions for obtaining optimal risk bounds for point estimates acquired from mean-field variational Bayesian inference. The conditions pertain to the existence of certain test functions for the distance metric on the parameter space and minimal assumptions on the prior. A general recipe for verification of the conditions is outlined which is broadly applicable to existing Bayesian models with or without latent variables. As illustrations, specific applications to Latent Dirichlet Allocation and Gaussian mixture models are discussed.
APA
Pati, D., Bhattacharya, A. & Yang, Y. (2018). On Statistical Optimality of Variational Bayes. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1579-1588. Available from https://proceedings.mlr.press/v84/pati18a.html.
