Adaptive Rao-Blackwellisation in Gibbs Sampling for Probabilistic Graphical Models
Proceedings of Machine Learning Research, PMLR 89:2907-2915, 2019.
Abstract
Rao-Blackwellisation is a technique that provably improves the performance of Gibbs sampling by summing out variables from the PGM. However, collapsing variables is computationally expensive, since it changes the PGM structure by introducing factors whose size depends on the Markov blanket of the collapsed variable. Therefore, jointly collapsing out several variables is typically intractable in arbitrary PGM structures. In this paper, we propose an adaptive approach to Rao-Blackwellisation, in which we add parallel Markov chains defined over different collapsed PGM structures. The collapsed variables are chosen based on their convergence diagnostics. However, adding a new chain requires burn-in, thus wasting samples. To address this, we initialize the new chains from a mean-field approximation of the distribution that improves over time, thereby reducing the burn-in period. Our experiments on several UAI benchmarks show that our approach is more accurate than state-of-the-art inference systems such as Merlin, which implements algorithms that have previously won the UAI inference challenge.
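To make the collapsing step concrete, the sketch below sums a binary variable out of a toy chain-structured PGM and runs Gibbs sampling on the resulting collapsed factor. The model, the potential values, and the function names are all illustrative assumptions for this sketch, not the paper's implementation; note how the collapsed factor's scope is exactly the Markov blanket of the summed-out variable, which is why collapsing many variables jointly blows up in general structures.

```python
import itertools
import random

random.seed(0)

# Toy chain PGM over binary variables A - B - C with two pairwise
# factors. The potential values are made up for illustration.
phi_ab = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
phi_bc = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

# Rao-Blackwellisation: sum out B. The new factor's scope is
# B's Markov blanket {A, C}.
phi_ac = {(a, c): sum(phi_ab[(a, b)] * phi_bc[(b, c)] for b in (0, 1))
          for a in (0, 1) for c in (0, 1)}

def gibbs_collapsed(n_iters=5000):
    """Gibbs sampling on the collapsed model over (A, C)."""
    a, c = 0, 0
    counts = {s: 0 for s in itertools.product((0, 1), repeat=2)}
    for _ in range(n_iters):
        # Resample A given C from the collapsed factor.
        w = [phi_ac[(0, c)], phi_ac[(1, c)]]
        a = random.choices((0, 1), weights=w)[0]
        # Resample C given A.
        w = [phi_ac[(a, 0)], phi_ac[(a, 1)]]
        c = random.choices((0, 1), weights=w)[0]
        counts[(a, c)] += 1
    return counts
```

For this toy model the exact collapsed distribution is easy to check by normalising `phi_ac` (here P(A=1, C=1) = 10/25 = 0.4), so the sampled frequencies can be compared directly against it.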