Stein Variational Message Passing for Continuous Graphical Models

Dilin Wang, Zhe Zeng, Qiang Liu
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5219-5227, 2018.

Abstract

We propose a novel distributed inference algorithm for continuous graphical models, by extending Stein variational gradient descent (SVGD) to leverage the Markov dependency structure of the distribution of interest. Our approach combines SVGD with a set of structured local kernel functions defined on the Markov blanket of each node, which alleviates the curse of high dimensionality and simultaneously yields a distributed algorithm for decentralized inference tasks. We justify our method with theoretical analysis and show that the use of local kernels can be viewed as a new type of localized approximation that matches the target distribution on the conditional distributions of each node over its Markov blanket. Our empirical results show that our method outperforms a variety of baselines including standard MCMC and particle message passing methods.
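To make the idea concrete, here is a minimal sketch (not the authors' code) of SVGD with node-local kernels on a Gaussian chain: each coordinate's update uses an RBF kernel evaluated only on that node's closed Markov blanket, while the driving term uses the node's own log-density gradient. The Gaussian-chain target, fixed step size, and median-bandwidth heuristic are illustrative assumptions, not details taken from the paper.

```python
# Sketch of SVGD with kernels restricted to each node's Markov blanket.
# Target: N(0, A^{-1}) with tridiagonal precision A, so the Markov
# blanket of node l is {l-1, l+1}. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

d, n = 10, 100                      # chain length, number of particles
A = (np.eye(d)
     + np.diag(np.full(d - 1, 0.4), 1)
     + np.diag(np.full(d - 1, 0.4), -1))   # tridiagonal precision

def grad_log_p(X):
    # rows of X are particles; grad log p(x) = -A x for the Gaussian target
    return -X @ A

def rbf_terms(Z, z_col):
    """RBF kernel on blanket coordinates Z (n x |C_l|), plus the
    derivative of k(x_j, x_i) w.r.t. the node coordinate of x_j."""
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    h = np.median(sq) / np.log(n + 1) + 1e-8   # median heuristic
    K = np.exp(-sq / h)                        # K[j, i] = k(x_j, x_i)
    # d/dx_j[l] k(x_j, x_i) = (2/h) * k * (x_i[l] - x_j[l])
    dK = (2.0 / h) * K * (z_col[None, :] - z_col[:, None])
    return K, dK

X = rng.normal(size=(n, d))         # initial particles
eps = 0.1
for _ in range(500):
    G = grad_log_p(X)
    Phi = np.zeros_like(X)
    for l in range(d):
        C = [i for i in (l - 1, l, l + 1) if 0 <= i < d]  # closed blanket
        K, dK = rbf_terms(X[:, C], X[:, l])
        # SVGD update for coordinate l, with kernel on blanket coords only
        Phi[:, l] = (K.T @ G[:, l] + dK.sum(axis=0)) / n
    X += eps * Phi

print(np.round(np.diag(np.cov(X.T)), 2))           # particle variances
print(np.round(np.diag(np.linalg.inv(A)), 2))      # target variances
```

Roughly, this is the sense in which the abstract's "structured local kernel functions" address high dimension: kernel similarity is measured in the low-dimensional blanket subspace, so the repulsive term does not degenerate as the full dimension grows, and each node's update touches only its neighborhood, which is what permits a distributed implementation.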

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-wang18l,
  title     = {Stein Variational Message Passing for Continuous Graphical Models},
  author    = {Wang, Dilin and Zeng, Zhe and Liu, Qiang},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5219--5227},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/wang18l/wang18l.pdf},
  url       = {https://proceedings.mlr.press/v80/wang18l.html}
}
Endnote
%0 Conference Paper
%T Stein Variational Message Passing for Continuous Graphical Models
%A Dilin Wang
%A Zhe Zeng
%A Qiang Liu
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-wang18l
%I PMLR
%P 5219--5227
%U https://proceedings.mlr.press/v80/wang18l.html
%V 80
APA
Wang, D., Zeng, Z. & Liu, Q. (2018). Stein Variational Message Passing for Continuous Graphical Models. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5219-5227. Available from https://proceedings.mlr.press/v80/wang18l.html.
