Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix

Charles Margossian, Lawrence K. Saul
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3466-3474, 2025.

Abstract

Given an intractable target density $p$, variational inference (VI) attempts to find the best approximation $q$ from a tractable family $\mathcal Q$. This is typically done by minimizing the exclusive Kullback-Leibler divergence, $\text{KL}(q||p)$. In practice, $\mathcal Q$ is not rich enough to contain $p$, and the approximation is misspecified even when it is a unique global minimizer of $\text{KL}(q||p)$. In this paper, we analyze the robustness of VI to these misspecifications when $p$ exhibits certain symmetries and $\mathcal Q$ is a location-scale family that shares these symmetries. We prove strong guarantees for VI not only under mild regularity conditions but also in the face of severe misspecifications. Namely, we show that (i) VI recovers the mean of $p$ when $p$ exhibits an even symmetry, and (ii) it recovers the correlation matrix of $p$ when in addition $p$ exhibits an elliptical symmetry. These guarantees hold for the mean even when $q$ is factorized and $p$ is not, and for the correlation matrix even when $q$ and $p$ behave differently in their tails. We analyze various regimes of Bayesian inference where these symmetries are useful idealizations, and we also investigate experimentally how VI behaves in their absence.
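The abstract describes VI as minimizing the exclusive KL divergence $\text{KL}(q||p)$ over a location-scale family, and guarantee (i) says the fitted location recovers the mean of $p$ whenever $p$ has an even symmetry, even when the tails of $q$ and $p$ differ. As an illustrative sketch (not code from the paper), the following minimizes $\text{KL}(q||p)$ by stochastic gradient descent for a Gaussian $q$ against a Laplace target, a non-Gaussian but symmetric density; the Laplace target, step size, and sample counts are all choices made for this demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p: Laplace(loc=m, scale=1), a non-Gaussian density with even
# symmetry about m. VI only needs its log-density up to a constant.
m = 2.0

# Variational family Q: Gaussians N(mu, exp(log_s)^2), a location-scale family.
mu, log_s = 0.0, 0.0

# Minimize KL(q||p) = E_eps[log q(x) - log p(x)] with x = mu + s*eps,
# eps ~ N(0,1). Since log q(x) = -0.5*eps^2 - log_s + const under this
# reparameterization, the parameter gradients reduce to:
#   d KL / d mu    = E[-d/dx log p(x)]
#   d KL / d log_s = -1 + E[-d/dx log p(x) * s * eps]
lr, n_mc = 0.05, 256
for _ in range(2000):
    eps = rng.standard_normal(n_mc)
    s = np.exp(log_s)
    x = mu + s * eps
    dlogp = -np.sign(x - m)  # d/dx log p(x) for the Laplace target
    mu -= lr * (-np.mean(dlogp))
    log_s -= lr * (-1.0 - np.mean(dlogp * s * eps))

# Despite q being Gaussian and p heavier-tailed, the fitted location mu
# lands (up to Monte Carlo noise) on the target mean m, consistent with
# guarantee (i) under even symmetry.
print(mu)
```

In this 1D example the symmetry argument can also be checked by hand: at $\mu = m$, the gradient $\mathbb{E}[-\partial_x \log p(x)] = \mathbb{E}[\mathrm{sign}(x-m)]$ vanishes because $x$ is distributed symmetrically about $m$, so the mean is a stationary point regardless of how the scales of $q$ and $p$ disagree.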

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-margossian25a,
  title     = {Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix},
  author    = {Margossian, Charles and Saul, Lawrence K.},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3466--3474},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/margossian25a/margossian25a.pdf},
  url       = {https://proceedings.mlr.press/v258/margossian25a.html},
  abstract  = {Given an intractable target density $p$, variational inference (VI) attempts to find the best approximation $q$ from a tractable family $\mathcal Q$. This is typically done by minimizing the exclusive Kullback-Leibler divergence, $\text{KL}(q||p)$. In practice, $\mathcal Q$ is not rich enough to contain $p$, and the approximation is misspecified even when it is a unique global minimizer of $\text{KL}(q||p)$. In this paper, we analyze the robustness of VI to these misspecifications when $p$ exhibits certain symmetries and $\mathcal Q$ is a location-scale family that shares these symmetries. We prove strong guarantees for VI not only under mild regularity conditions but also in the face of severe misspecifications. Namely, we show that (i) VI recovers the mean of $p$ when $p$ exhibits an even symmetry, and (ii) it recovers the correlation matrix of $p$ when in addition $p$ exhibits an elliptical symmetry. These guarantees hold for the mean even when $q$ is factorized and $p$ is not, and for the correlation matrix even when $q$ and $p$ behave differently in their tails. We analyze various regimes of Bayesian inference where these symmetries are useful idealizations, and we also investigate experimentally how VI behaves in their absence.}
}
Endnote
%0 Conference Paper
%T Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix
%A Charles Margossian
%A Lawrence K. Saul
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-margossian25a
%I PMLR
%P 3466--3474
%U https://proceedings.mlr.press/v258/margossian25a.html
%V 258
%X Given an intractable target density $p$, variational inference (VI) attempts to find the best approximation $q$ from a tractable family $\mathcal Q$. This is typically done by minimizing the exclusive Kullback-Leibler divergence, $\text{KL}(q||p)$. In practice, $\mathcal Q$ is not rich enough to contain $p$, and the approximation is misspecified even when it is a unique global minimizer of $\text{KL}(q||p)$. In this paper, we analyze the robustness of VI to these misspecifications when $p$ exhibits certain symmetries and $\mathcal Q$ is a location-scale family that shares these symmetries. We prove strong guarantees for VI not only under mild regularity conditions but also in the face of severe misspecifications. Namely, we show that (i) VI recovers the mean of $p$ when $p$ exhibits an even symmetry, and (ii) it recovers the correlation matrix of $p$ when in addition $p$ exhibits an elliptical symmetry. These guarantees hold for the mean even when $q$ is factorized and $p$ is not, and for the correlation matrix even when $q$ and $p$ behave differently in their tails. We analyze various regimes of Bayesian inference where these symmetries are useful idealizations, and we also investigate experimentally how VI behaves in their absence.
APA
Margossian, C. & Saul, L. K. (2025). Variational Inference in Location-Scale Families: Exact Recovery of the Mean and Correlation Matrix. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3466-3474. Available from https://proceedings.mlr.press/v258/margossian25a.html.