Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood

Kohei Hayashi, Shin-ichi Maeda, Ryohei Fujimaki
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1358-1366, 2015.

Abstract

Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies. First, we reveal the core idea of FIC that allows generalization for a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover their previously unknown relationship. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-hayashi15,
  title     = {Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood},
  author    = {Hayashi, Kohei and Maeda, Shin-ichi and Fujimaki, Ryohei},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1358--1366},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/hayashi15.pdf},
  url       = {https://proceedings.mlr.press/v37/hayashi15.html},
  abstract  = {Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies. First, we reveal the core idea of FIC that allows generalization for a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover their previously unknown relationship. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.}
}
Endnote
%0 Conference Paper
%T Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
%A Kohei Hayashi
%A Shin-ichi Maeda
%A Ryohei Fujimaki
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-hayashi15
%I PMLR
%P 1358--1366
%U https://proceedings.mlr.press/v37/hayashi15.html
%V 37
%X Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies. First, we reveal the core idea of FIC that allows generalization for a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover their previously unknown relationship. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.
RIS
TY  - CPAPER
TI  - Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
AU  - Kohei Hayashi
AU  - Shin-ichi Maeda
AU  - Ryohei Fujimaki
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-hayashi15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 1358
EP  - 1366
L1  - http://proceedings.mlr.press/v37/hayashi15.pdf
UR  - https://proceedings.mlr.press/v37/hayashi15.html
AB  - Factorized information criterion (FIC) is a recently developed approximation technique for the marginal log-likelihood, which provides an automatic model selection framework for a few latent variable models (LVMs) with tractable inference algorithms. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies. First, we reveal the core idea of FIC that allows generalization for a broader class of LVMs, including continuous LVMs, in contrast to previous FICs, which are applicable only to binary LVMs. Second, we investigate the model selection mechanism of the generalized FIC. Our analysis provides a formal justification of FIC as a model selection criterion for LVMs and also a systematic procedure for pruning redundant latent variables that have been removed heuristically in previous studies. Third, we provide an interpretation of FIC as a variational free energy and uncover their previously unknown relationship. A demonstrative study on Bayesian principal component analysis is provided, and numerical experiments support our theoretical results.
ER  -
APA
Hayashi, K., Maeda, S. & Fujimaki, R. (2015). Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1358-1366. Available from https://proceedings.mlr.press/v37/hayashi15.html.
