Consistency of ELBO maximization for model selection

Badr-Eddine Cherief-Abdellatif
Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference, PMLR 96:11-31, 2019.

Abstract

The Evidence Lower Bound (ELBO) is a quantity that plays a key role in variational inference. It can also be used as a criterion in model selection. However, though extremely popular in practice in the variational Bayes community, there has never been a general theoretical justification for selecting based on the ELBO. In this paper, we show that the ELBO maximization strategy has strong theoretical guarantees and is robust to model misspecification, whereas most works rely on the assumption that one model is correctly specified. We illustrate our theoretical results by an application to the selection of the number of principal components in probabilistic PCA.
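The selection strategy studied in the paper is simple to state: fit a variational approximation to each candidate model, evaluate its ELBO, and keep the model with the largest value. A minimal sketch of that recipe, on a toy problem rather than the paper's probabilistic-PCA setting: two candidate models for scalar data, one with a fixed zero mean and one with an unknown mean under a conjugate Gaussian prior, where the optimal variational posterior (and hence the ELBO) is available in closed form. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def elbo_fixed_mean(x):
    # Model 0: x_i ~ N(0, 1), no latent parameters.
    # With nothing to approximate, the ELBO is exactly the log likelihood.
    n = len(x)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum(x ** 2)

def elbo_free_mean(x):
    # Model 1: x_i ~ N(mu, 1) with prior mu ~ N(0, 1).
    # By conjugacy, the exact posterior N(m, s2) is the optimal
    # variational q(mu), so the ELBO can be written in closed form:
    # ELBO = E_q[log p(x|mu)] + E_q[log p(mu)] + H[q].
    n = len(x)
    s2 = 1.0 / (n + 1.0)           # posterior variance
    m = np.sum(x) * s2             # posterior mean
    exp_loglik = (-0.5 * n * np.log(2 * np.pi)
                  - 0.5 * (np.sum((x - m) ** 2) + n * s2))
    exp_logprior = -0.5 * np.log(2 * np.pi) - 0.5 * (m ** 2 + s2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * s2)
    return exp_loglik + exp_logprior + entropy

# Data clearly centered away from zero: ELBO maximization
# should select the free-mean model.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=200)
elbos = {"fixed mean": elbo_fixed_mean(x), "free mean": elbo_free_mean(x)}
best = max(elbos, key=elbos.get)
print(best)  # selects "free mean" on this data
```

Because the conjugate ELBO here equals the exact log evidence, the comparison inherits the usual Occam penalty of the marginal likelihood; the paper's contribution is showing that this kind of ELBO-based comparison remains consistent in general, even when every candidate model is misspecified.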

Cite this Paper


BibTeX
@InProceedings{pmlr-v96-cherief-abdellatif19a,
  title = {Consistency of ELBO maximization for model selection},
  author = {Cherief-Abdellatif, Badr-Eddine},
  booktitle = {Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference},
  pages = {11--31},
  year = {2019},
  editor = {Ruiz, Francisco and Zhang, Cheng and Liang, Dawen and Bui, Thang},
  volume = {96},
  series = {Proceedings of Machine Learning Research},
  month = {02 Dec},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v96/cherief-abdellatif19a/cherief-abdellatif19a.pdf},
  url = {https://proceedings.mlr.press/v96/cherief-abdellatif19a.html},
  abstract = {The Evidence Lower Bound (ELBO) is a quantity that plays a key role in variational inference. It can also be used as a criterion in model selection. However, though extremely popular in practice in the variational Bayes community, there has never been a general theoretical justification for selecting based on the ELBO. In this paper, we show that the ELBO maximization strategy has strong theoretical guarantees and is robust to model misspecification, whereas most works rely on the assumption that one model is correctly specified. We illustrate our theoretical results by an application to the selection of the number of principal components in probabilistic PCA.}
}
Endnote
%0 Conference Paper
%T Consistency of ELBO maximization for model selection
%A Badr-Eddine Cherief-Abdellatif
%B Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2019
%E Francisco Ruiz
%E Cheng Zhang
%E Dawen Liang
%E Thang Bui
%F pmlr-v96-cherief-abdellatif19a
%I PMLR
%P 11--31
%U https://proceedings.mlr.press/v96/cherief-abdellatif19a.html
%V 96
%X The Evidence Lower Bound (ELBO) is a quantity that plays a key role in variational inference. It can also be used as a criterion in model selection. However, though extremely popular in practice in the variational Bayes community, there has never been a general theoretical justification for selecting based on the ELBO. In this paper, we show that the ELBO maximization strategy has strong theoretical guarantees and is robust to model misspecification, whereas most works rely on the assumption that one model is correctly specified. We illustrate our theoretical results by an application to the selection of the number of principal components in probabilistic PCA.
APA
Cherief-Abdellatif, B.-E. (2019). Consistency of ELBO maximization for model selection. Proceedings of The 1st Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 96:11-31. Available from https://proceedings.mlr.press/v96/cherief-abdellatif19a.html.