Asymptotic Behavior of the Coordinate Ascent Variational Inference in Singular Models

Sean C Plummer, Anirban Bhattacharya, Debdeep Pati, Yun Yang
Conference on Parsimony and Learning, PMLR 280:652-674, 2025.

Abstract

Mean-field approximations are widely used for efficiently approximating high-dimensional integrals. While the efficacy of such approximations is well understood for well-behaved likelihoods, it is not clear how accurately they can approximate the marginal likelihood associated with a highly non-log-concave singular model. In this article, we provide a case study of the convergence behavior of coordinate ascent variational inference (CAVI) in the context of a general $d$-dimensional singular model in standard form. We prove that for a general $d$-dimensional singular model in standard form with real log canonical threshold (RLCT) $\lambda$ and multiplicity $m$, the CAVI system converges to one of $m$ locally attracting fixed points. Furthermore, at each of these fixed points, the evidence lower bound (ELBO) of the system recovers the leading-order behavior of the asymptotic expansion of the log marginal likelihood predicted by \citet{watanabe1999algebraic, watanabe2001algebraic, watanabe2001balgebraic}. Our empirical results demonstrate that for models with multiplicity $m=1$ the ELBO provides a tighter approximation to the log marginal likelihood than the asymptotic approximation $-\lambda \log n + o( \log \log n)$ of \citet{watanabe1999algebraic}.
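For readers unfamiliar with the CAVI iteration the abstract refers to, the sketch below runs coordinate ascent on a standard conjugate Normal-Gamma model with a mean-field family $q(\mu)\,q(\tau)$. This is a textbook, non-singular example chosen only to illustrate the fixed-point iteration; it is not the singular model analyzed in the paper, and all function and variable names here are illustrative.

```python
import random

# Toy CAVI for x_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/(lam0*tau)),
# tau ~ Gamma(a0, b0), with mean-field family q(mu, tau) = q(mu) q(tau):
# q(mu) = N(mu_n, 1/lam_n), q(tau) = Gamma(a_n, b_n).
def cavi_normal_gamma(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, iters=50):
    n = len(x)
    xbar = sum(x) / n
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)  # fixed across iterations
    a_n = a0 + (n + 1) / 2                       # fixed across iterations
    e_tau = a0 / b0                              # initialize E_q[tau]
    for _ in range(iters):
        # Update q(mu): its precision depends on the current E_q[tau].
        lam_n = (lam0 + n) * e_tau
        # Update q(tau): its rate depends on expectations under q(mu).
        ss = sum((xi - mu_n) ** 2 for xi in x) + n / lam_n
        b_n = b0 + 0.5 * (ss + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
        e_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(500)]
mu_n, lam_n, a_n, b_n = cavi_normal_gamma(data)
print(mu_n, a_n / b_n)  # variational posterior mean of mu and E_q[tau]
```

In this conjugate setting the iteration contracts to a single fixed point; the paper's contribution concerns the singular case, where the analogous dynamical system has $m$ locally attracting fixed points.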

Cite this Paper


BibTeX
@InProceedings{pmlr-v280-plummer25a,
  title     = {Asymptotic Behavior of the Coordinate Ascent Variational Inference in Singular Models},
  author    = {Plummer, Sean C and Bhattacharya, Anirban and Pati, Debdeep and Yang, Yun},
  booktitle = {Conference on Parsimony and Learning},
  pages     = {652--674},
  year      = {2025},
  editor    = {Chen, Beidi and Liu, Shijia and Pilanci, Mert and Su, Weijie and Sulam, Jeremias and Wang, Yuxiang and Zhu, Zhihui},
  volume    = {280},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Mar},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v280/main/assets/plummer25a/plummer25a.pdf},
  url       = {https://proceedings.mlr.press/v280/plummer25a.html},
  abstract  = {Mean-field approximations are widely used for efficiently approximating high-dimensional integrals. While the efficacy of such approximations is well understood for well-behaved likelihoods, it is not clear how accurately it can approximate the marginal likelihood associated with a highly non log-concave singular model. In this article, we provide a case study of the convergence behavior of coordinate ascent variational inference (CAVI) in the context of a general $d$-dimensional singular model in standard form. We prove that for a general $d$-dimensional singular model in standard form with real log canonical threshold (RLCT) $\lambda$ and multiplicity $m$, the CAVI system converges to one of $m$ locally attracting fixed points. Furthermore, at each of these fixed points, the evidence lower bound (ELBO) of the system recovers the leading-order behavior of the asymptotic expansion of the log marginal likelihood predicted by \citet{watanabe1999algebraic, watanabe2001algebraic, watanabe2001balgebraic}. Our empirical results demonstrate that for models with multiplicity $m=1$ the ELBO provides a tighter approximation to the log-marginal likelihood than the asymptotic approximation $-\lambda \log n + o( \log \log n)$ of \citet{watanabe1999algebraic}.}
}
Endnote
%0 Conference Paper
%T Asymptotic Behavior of the Coordinate Ascent Variational Inference in Singular Models
%A Sean C Plummer
%A Anirban Bhattacharya
%A Debdeep Pati
%A Yun Yang
%B Conference on Parsimony and Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Beidi Chen
%E Shijia Liu
%E Mert Pilanci
%E Weijie Su
%E Jeremias Sulam
%E Yuxiang Wang
%E Zhihui Zhu
%F pmlr-v280-plummer25a
%I PMLR
%P 652--674
%U https://proceedings.mlr.press/v280/plummer25a.html
%V 280
%X Mean-field approximations are widely used for efficiently approximating high-dimensional integrals. While the efficacy of such approximations is well understood for well-behaved likelihoods, it is not clear how accurately it can approximate the marginal likelihood associated with a highly non log-concave singular model. In this article, we provide a case study of the convergence behavior of coordinate ascent variational inference (CAVI) in the context of a general $d$-dimensional singular model in standard form. We prove that for a general $d$-dimensional singular model in standard form with real log canonical threshold (RLCT) $\lambda$ and multiplicity $m$, the CAVI system converges to one of $m$ locally attracting fixed points. Furthermore, at each of these fixed points, the evidence lower bound (ELBO) of the system recovers the leading-order behavior of the asymptotic expansion of the log marginal likelihood predicted by \citet{watanabe1999algebraic, watanabe2001algebraic, watanabe2001balgebraic}. Our empirical results demonstrate that for models with multiplicity $m=1$ the ELBO provides a tighter approximation to the log-marginal likelihood than the asymptotic approximation $-\lambda \log n + o( \log \log n)$ of \citet{watanabe1999algebraic}.
APA
Plummer, S.C., Bhattacharya, A., Pati, D. & Yang, Y. (2025). Asymptotic Behavior of the Coordinate Ascent Variational Inference in Singular Models. Conference on Parsimony and Learning, in Proceedings of Machine Learning Research 280:652-674. Available from https://proceedings.mlr.press/v280/plummer25a.html.