Structured variational inference in Bayesian state-space models

Honggang Wang, Anirban Bhattacharya, Debdeep Pati, Yun Yang
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:8884-8905, 2022.

Abstract

Variational inference is routinely deployed in Bayesian state-space models as an efficient computational technique. Motivated by the inconsistency issue observed by Wang and Titterington (2004) for the mean-field approximation in linear state-space models, we consider a more expressive variational family for approximating the joint posterior of the latent variables, so as to retain their dependence, while maintaining the mean-field (i.e., independence) structure between latent variables and parameters. In state-space models, such a latent-structure-adapted mean-field approximation can be efficiently computed using the belief propagation algorithm. Theoretically, we show that this adapted mean-field approximation achieves consistency of the variational estimates. Furthermore, we derive a non-asymptotic risk bound for an averaged alpha-divergence from the true data-generating model, suggesting that the posterior mean of the best variational approximation for the static parameters attains optimal concentration. From a broader perspective, we add to the growing literature on the statistical accuracy of variational approximations by allowing dependence between the latent variables, and the techniques developed here should be useful in related contexts.
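For orientation, the two variational families contrasted in the abstract can be written in generic state-space notation (theta for the static parameters, x_{1:T} for the latent chain; this is a sketch, not necessarily the paper's own notation):

    q(\theta, x_{1:T}) = q(\theta) \prod_{t=1}^{T} q(x_t)      % full mean field: latents mutually independent
    q(\theta, x_{1:T}) = q(\theta)\, q(x_{1:T}),
    \quad q(x_{1:T}) = q(x_1) \prod_{t=2}^{T} q(x_t \mid x_{t-1})  % structured family: latent dependence retained

The abstract's remark that the latent block can be computed by belief propagation has a familiar concrete instance in the linear-Gaussian case: with the static parameters held fixed (say, at their current variational values), belief propagation on the latent chain reduces to Kalman filtering followed by Rauch-Tung-Striebel smoothing. Below is a minimal scalar NumPy sketch of that latent-chain step, assuming illustrative parameter names (a, c, q, r) that are not from the paper; it is offered as an illustration of the general idea, not the authors' implementation.

    # Hedged illustration (not the paper's code): for the scalar model
    #   x_t = a * x_{t-1} + N(0, q),   y_t = c * x_t + N(0, r),
    # belief propagation on the chain = Kalman filter + RTS smoother.
    import numpy as np

    def kalman_smoother(y, a, c, q, r, m0, p0):
        T = len(y)
        mf = np.empty(T); pf = np.empty(T)         # filtered means / variances
        mpred = np.empty(T); ppred = np.empty(T)   # one-step predictive moments
        mp, pp = m0, p0
        for t in range(T):                         # forward (filtering) pass
            mpred[t], ppred[t] = mp, pp
            k = pp * c / (c * pp * c + r)          # Kalman gain
            mf[t] = mp + k * (y[t] - c * mp)
            pf[t] = (1.0 - k * c) * pp
            mp, pp = a * mf[t], a * pf[t] * a + q  # predict the next state
        ms = np.empty(T); ps = np.empty(T)         # smoothed moments
        ms[-1], ps[-1] = mf[-1], pf[-1]
        for t in range(T - 2, -1, -1):             # backward (RTS) pass
            g = pf[t] * a / ppred[t + 1]
            ms[t] = mf[t] + g * (ms[t + 1] - mpred[t + 1])
            ps[t] = pf[t] + g * (ps[t + 1] - ppred[t + 1]) * g
        return ms, ps

    # usage: smoothed latent posteriors under fixed (hypothetical) parameters
    rng = np.random.default_rng(0)
    x, ys = 0.0, []
    for _ in range(100):
        x = 0.9 * x + rng.normal(scale=1.0)
        ys.append(0.5 * x + rng.normal(scale=0.5))
    ms, ps = kalman_smoother(np.array(ys), a=0.9, c=0.5, q=1.0, r=0.25,
                             m0=0.0, p0=1.0)

In a full coordinate-ascent scheme this smoothing pass would alternate with an update of q(theta); the sketch shows only the latent-chain step that the abstract attributes to belief propagation.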

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-wang22g,
  title     = {Structured variational inference in {B}ayesian state-space models},
  author    = {Wang, Honggang and Bhattacharya, Anirban and Pati, Debdeep and Yang, Yun},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {8884--8905},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/wang22g/wang22g.pdf},
  url       = {https://proceedings.mlr.press/v151/wang22g.html}
}
Endnote
%0 Conference Paper
%T Structured variational inference in Bayesian state-space models
%A Honggang Wang
%A Anirban Bhattacharya
%A Debdeep Pati
%A Yun Yang
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-wang22g
%I PMLR
%P 8884--8905
%U https://proceedings.mlr.press/v151/wang22g.html
%V 151
APA
Wang, H., Bhattacharya, A., Pati, D. & Yang, Y. (2022). Structured variational inference in Bayesian state-space models. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:8884-8905. Available from https://proceedings.mlr.press/v151/wang22g.html.
