Variational Russian Roulette for Deep Bayesian Nonparametrics
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6963-6972, 2019.
Abstract
Bayesian nonparametric models provide a principled way to automatically adapt the complexity of a model to the amount of data available, but computation in such models is difficult. Amortized variational approximations are appealing because of their computational efficiency, but current methods rely on a fixed finite truncation of the infinite model. This truncation level can be difficult to set, and it also interacts poorly with amortized methods due to the overpruning problem. Instead, we propose a new variational approximation based on a method from statistical physics called Russian roulette sampling. This allows the variational distribution to adapt its complexity during inference, without relying on a fixed truncation level, while still obtaining an unbiased estimate of the gradient of the original variational objective. We demonstrate this method on infinite-sized variational autoencoders using a Beta-Bernoulli (Indian buffet process) prior.
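The core idea behind Russian roulette sampling is that an infinite sum can be estimated without bias from a random finite truncation, provided each computed term is reweighted by the probability of having reached it. The sketch below is illustrative only, assuming a simple geometric stopping rule; it is not the paper's implementation, and the function names are hypothetical.

```python
import random

def russian_roulette_estimate(term, stop_prob=0.5, rng=random):
    """Unbiased single-sample estimate of sum_{k=0}^inf term(k).

    After computing each term, the loop continues with probability
    (1 - stop_prob); term k is divided by P(reaching k) = (1 - stop_prob)**k,
    which makes the randomly truncated sum unbiased for the infinite one.
    """
    total = 0.0
    survive_prob = 1.0  # probability of having reached the current term
    k = 0
    while True:
        total += term(k) / survive_prob
        if rng.random() < stop_prob:  # the roulette fires: stop here
            return total
        survive_prob *= 1.0 - stop_prob
        k += 1

# Illustration: the geometric series sum_{k>=0} 0.5**k = 2.
random.seed(0)
n = 20000
avg = sum(russian_roulette_estimate(lambda k: 0.5 ** k) for _ in range(n)) / n
```

Averaged over many draws, the estimate concentrates around the true value of 2, even though each individual draw only evaluates finitely many terms. In the paper's setting, the same trick is applied to the variational objective of the infinite model rather than to a scalar series.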