Particle Gibbs with Ancestor Sampling for Probabilistic Programs
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:986-994, 2015.
Abstract
Particle Markov chain Monte Carlo techniques rank among current state-of-the-art methods for probabilistic program inference. A drawback of these techniques is that they rely on importance resampling, which results in degenerate particle trajectories and a low effective sample size for variables sampled early in a program. We here develop a formalism to adapt ancestor resampling, a technique that mitigates particle degeneracy, to the probabilistic programming setting. We present empirical results that demonstrate nontrivial performance gains.
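To make the degeneracy issue and the ancestor-sampling remedy concrete, the sketch below shows one conditional SMC sweep with ancestor sampling (in the spirit of Lindsten et al.'s PGAS) on a toy Gaussian random-walk state-space model. This is an illustrative assumption, not the paper's probabilistic-programming formalism: the model, parameter values, and function names (`pgas_sweep`, `q`, `r`, `n_particles`) are chosen purely for exposition.

```python
# Illustrative sketch (not the paper's implementation): one conditional SMC sweep
# with ancestor sampling on a toy model  x_t ~ N(x_{t-1}, q),  y_t ~ N(x_t, r).
import numpy as np

def pgas_sweep(y, x_ref, n_particles=100, q=1.0, r=1.0, rng=None):
    """One particle Gibbs sweep with ancestor sampling.

    y     : observations, shape (T,)
    x_ref : retained reference trajectory from the previous sweep, shape (T,)
    Returns a newly sampled trajectory of shape (T,).
    """
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), n_particles
    x = np.zeros((T, N))          # particle positions
    anc = np.zeros((T, N), int)   # ancestor indices

    # Initialise: particles 0..N-2 from the prior, particle N-1 is the reference.
    x[0, :-1] = rng.normal(0.0, np.sqrt(q), N - 1)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()

        # Multinomial resampling and propagation for the free particles.
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :-1] = rng.normal(x[t - 1, anc[t, :-1]], np.sqrt(q))

        # Ancestor sampling for the reference particle: rather than keeping its
        # old ancestry, resample its ancestor with probability proportional to
        # w_{t-1}^i * p(x_t^ref | x_{t-1}^i).  This is the step that mitigates
        # path degeneracy for variables sampled early in the sequence.
        log_as = np.log(w) - 0.5 * (x_ref[t] - x[t - 1]) ** 2 / q
        p_as = np.exp(log_as - log_as.max())
        anc[t, -1] = rng.choice(N, p=p_as / p_as.sum())
        x[t, -1] = x_ref[t]

        logw = -0.5 * (y[t] - x[t]) ** 2 / r

    # Draw one trajectory by tracing ancestry back from a final-weight sample.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in range(T - 1, -1, -1):
        traj[t] = x[t, k]
        k = anc[t, k]
    return traj
```

Iterating `pgas_sweep`, feeding each returned trajectory back in as `x_ref`, gives a particle Gibbs chain; without the ancestor-sampling step, repeated resampling collapses the early-time portion of every particle onto the reference path, which is the degeneracy the abstract refers to.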