Q-conjugate Message Passing for Efficient Bayesian Inference

Mykola Lukashchuk, İsmail Şenöz, Bert de Vries
Proceedings of The 12th International Conference on Probabilistic Graphical Models, PMLR 246:295-311, 2024.

Abstract

Bayesian inference in nonconjugate models such as Bayesian Poisson regression often relies on computationally expensive Monte Carlo methods. This paper introduces Q-conjugacy, a generalization of classical conjugacy that enables efficient closed-form variational inference in certain nonconjugate models. Q-conjugacy is a condition in which a closed-form update scheme expresses the solution minimizing the Kullback-Leibler divergence between a variational distribution and the product of two potentially unnormalized distributions. Leveraging Q-conjugacy within a local message passing framework allows deriving analytic inference update equations for nonconjugate models. The effectiveness of this approach is demonstrated on Bayesian Poisson regression and a model involving a hidden gamma-distributed latent variable with Gaussian-corrupted logarithmic observations. Results show that Q-conjugate triplets, such as (Gamma, LogNormal, Gamma), provide better speed-accuracy trade-offs than Markov Chain Monte Carlo.
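The objective described in the abstract can be checked numerically. The sketch below fits a Gamma variational distribution q by minimizing the KL divergence to the unnormalized product of a Gamma factor and a LogNormal factor, i.e. the (Gamma, LogNormal, Gamma) triplet the abstract names. This is only an illustration of the objective, not the paper's closed-form update; the KL direction (KL(q || f)), all parameter values, and the Monte Carlo/Nelder-Mead optimization strategy are assumptions made here for the sketch.

```python
# Numerical sketch of the Q-conjugacy objective: find the Gamma q minimizing
# KL(q || f1 * f2) up to the unknown normalizer of the product, where
# f1 is a Gamma factor and f2 a LogNormal factor (illustrative parameters).
# KL direction and all numbers are assumptions; the paper gives the
# closed-form update, which is not reproduced here.
import numpy as np
from scipy import stats, optimize

a0, b0 = 2.0, 1.0      # assumed Gamma factor (e.g. a prior message)
mu, sigma = 0.5, 0.7   # assumed LogNormal factor (e.g. a likelihood message)

def neg_objective(params):
    a, b = np.exp(params)                 # log-parametrization keeps a, b > 0
    q = stats.gamma(a, scale=1.0 / b)
    # Fixed seed => common random numbers, so the objective is deterministic
    # in (a, b) and safe to pass to a derivative-free optimizer.
    x = q.rvs(size=100_000, random_state=0)
    log_q = q.logpdf(x)
    log_f = (stats.gamma(a0, scale=1.0 / b0).logpdf(x)
             + stats.lognorm(sigma, scale=np.exp(mu)).logpdf(x))
    return np.mean(log_q - log_f)         # KL(q || f1*f2) + const

x0 = np.log([2.0, 1.0])
res = optimize.minimize(neg_objective, x0=x0, method="Nelder-Mead")
a_opt, b_opt = np.exp(res.x)
print(f"optimal Gamma q: shape={a_opt:.3f}, rate={b_opt:.3f}")
```

Because the product of the two factors is unnormalized, its log-normalizer is constant in (a, b) and drops out of the minimization, which is what makes a local projection of this kind well defined.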

Cite this Paper


BibTeX
@InProceedings{pmlr-v246-lukashchuk24a,
  title = {{Q}-conjugate Message Passing for Efficient Bayesian Inference},
  author = {Lukashchuk, Mykola and {\c{S}en\"{o}z}, {\.{I}}smail and {de Vries}, Bert},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  pages = {295--311},
  year = {2024},
  editor = {Kwisthout, Johan and Renooij, Silja},
  volume = {246},
  series = {Proceedings of Machine Learning Research},
  month = {11--13 Sep},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v246/main/assets/lukashchuk24a/lukashchuk24a.pdf},
  url = {https://proceedings.mlr.press/v246/lukashchuk24a.html},
  abstract = {Bayesian inference in nonconjugate models such as Bayesian Poisson regression often relies on computationally expensive Monte Carlo methods. This paper introduces {Q}-conjugacy, a generalization of classical conjugacy that enables efficient closed-form variational inference in certain nonconjugate models. {Q}-conjugacy is a condition in which a closed-form update scheme expresses the solution minimizing the Kullback-Leibler divergence between a variational distribution and the product of two potentially unnormalized distributions. Leveraging {Q}-conjugacy within a local message passing framework allows deriving analytic inference update equations for nonconjugate models. The effectiveness of this approach is demonstrated on Bayesian Poisson regression and a model involving a hidden gamma-distributed latent variable with Gaussian-corrupted logarithmic observations. Results show that {Q}-conjugate triplets, such as (Gamma, LogNormal, Gamma), provide better speed-accuracy trade-offs than Markov Chain Monte Carlo.}
}
Endnote
%0 Conference Paper
%T Q-conjugate Message Passing for Efficient Bayesian Inference
%A Mykola Lukashchuk
%A İsmail Şenöz
%A Bert de Vries
%B Proceedings of The 12th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2024
%E Johan Kwisthout
%E Silja Renooij
%F pmlr-v246-lukashchuk24a
%I PMLR
%P 295--311
%U https://proceedings.mlr.press/v246/lukashchuk24a.html
%V 246
%X Bayesian inference in nonconjugate models such as Bayesian Poisson regression often relies on computationally expensive Monte Carlo methods. This paper introduces Q-conjugacy, a generalization of classical conjugacy that enables efficient closed-form variational inference in certain nonconjugate models. Q-conjugacy is a condition in which a closed-form update scheme expresses the solution minimizing the Kullback-Leibler divergence between a variational distribution and the product of two potentially unnormalized distributions. Leveraging Q-conjugacy within a local message passing framework allows deriving analytic inference update equations for nonconjugate models. The effectiveness of this approach is demonstrated on Bayesian Poisson regression and a model involving a hidden gamma-distributed latent variable with Gaussian-corrupted logarithmic observations. Results show that Q-conjugate triplets, such as (Gamma, LogNormal, Gamma), provide better speed-accuracy trade-offs than Markov Chain Monte Carlo.
APA
Lukashchuk, M., Şenöz, İ. & de Vries, B. (2024). Q-conjugate Message Passing for Efficient Bayesian Inference. Proceedings of The 12th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 246:295-311. Available from https://proceedings.mlr.press/v246/lukashchuk24a.html.