Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC

Priyank Jaini, Didrik Nielsen, Max Welling
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3349-3357, 2021.

Abstract

Hybrid Monte Carlo (HMC) is a powerful Markov chain Monte Carlo method for sampling from complex continuous distributions. However, a major limitation of HMC is that it cannot be applied to discrete domains, where no gradient signal is available. In this work, we introduce a new approach that augments Monte Carlo methods with SurVAE Flows to sample from discrete distributions, combining neural transport methods such as normalizing flows, variational dequantization, and the Metropolis-Hastings rule. Our method first learns a continuous embedding of the discrete space using a surjective map and subsequently learns a bijective transformation from this continuous space to an approximately Gaussian-distributed latent variable. Sampling then proceeds by simulating MCMC chains in the latent space and mapping these samples to the target discrete space via the learned transformations. We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics, and machine learning, and observe improvements over alternative algorithms.
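The pipeline described above can be illustrated with a deliberately minimal sketch: a toy discrete target is dequantized into a continuous surrogate density (here via simple uniform dequantization, with the paper's learned surjective embedding and normalizing flow replaced by the identity map for brevity), a Metropolis-Hastings chain is simulated in the continuous space, and samples are mapped back to the discrete domain by flooring. The target distribution and all parameter values below are hypothetical and not taken from the paper.

```python
import math
import random
from collections import Counter

# Toy discrete target over {0, 1, 2, 3} (hypothetical, for illustration only).
probs = [0.1, 0.2, 0.3, 0.4]

def log_density(y):
    """Log-density of the continuous surrogate obtained by uniform
    dequantization: y = x + u with u ~ U[0, 1), so q(y) = p(floor(y)).
    The paper's learned flow to a Gaussian latent is replaced here by
    the identity map to keep the sketch self-contained."""
    x = math.floor(y)
    if 0 <= x < len(probs):
        return math.log(probs[x])
    return float("-inf")  # outside the support of the embedding

def mh_chain(n_steps, step=0.8, seed=0):
    """Metropolis-Hastings in the continuous (latent) space; each sample
    is mapped back to the discrete space by flooring."""
    rng = random.Random(seed)
    y = 1.5  # start inside the support
    samples = []
    for _ in range(n_steps):
        y_prop = y + rng.gauss(0.0, step)           # symmetric Gaussian proposal
        log_alpha = log_density(y_prop) - log_density(y)
        if rng.random() < math.exp(min(0.0, log_alpha)):  # MH accept rule
            y = y_prop
        samples.append(math.floor(y))               # map back to discrete space
    return samples

n = 50_000
counts = Counter(mh_chain(n))
freqs = [counts[k] / n for k in range(4)]
print(freqs)  # empirical frequencies should approximate probs
```

Because the proposal is symmetric, the acceptance ratio reduces to the density ratio, and the marginal of `floor(y)` under the chain's stationary distribution recovers the discrete target. In the actual method, the chain instead runs in the flow's approximately Gaussian latent space, where geometry is simpler and mixing is faster.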

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-jaini21a,
  title     = {Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC},
  author    = {Jaini, Priyank and Nielsen, Didrik and Welling, Max},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {3349--3357},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/jaini21a/jaini21a.pdf},
  url       = {https://proceedings.mlr.press/v130/jaini21a.html},
  abstract  = {Hybrid Monte Carlo (HMC) is a powerful Markov chain Monte Carlo method for sampling from complex continuous distributions. However, a major limitation of HMC is that it cannot be applied to discrete domains, where no gradient signal is available. In this work, we introduce a new approach that augments Monte Carlo methods with SurVAE Flows to sample from discrete distributions, combining neural transport methods such as normalizing flows, variational dequantization, and the Metropolis-Hastings rule. Our method first learns a continuous embedding of the discrete space using a surjective map and subsequently learns a bijective transformation from this continuous space to an approximately Gaussian-distributed latent variable. Sampling then proceeds by simulating MCMC chains in the latent space and mapping these samples to the target discrete space via the learned transformations. We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics, and machine learning, and observe improvements over alternative algorithms.}
}
Endnote
%0 Conference Paper
%T Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC
%A Priyank Jaini
%A Didrik Nielsen
%A Max Welling
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-jaini21a
%I PMLR
%P 3349--3357
%U https://proceedings.mlr.press/v130/jaini21a.html
%V 130
%X Hybrid Monte Carlo (HMC) is a powerful Markov chain Monte Carlo method for sampling from complex continuous distributions. However, a major limitation of HMC is that it cannot be applied to discrete domains, where no gradient signal is available. In this work, we introduce a new approach that augments Monte Carlo methods with SurVAE Flows to sample from discrete distributions, combining neural transport methods such as normalizing flows, variational dequantization, and the Metropolis-Hastings rule. Our method first learns a continuous embedding of the discrete space using a surjective map and subsequently learns a bijective transformation from this continuous space to an approximately Gaussian-distributed latent variable. Sampling then proceeds by simulating MCMC chains in the latent space and mapping these samples to the target discrete space via the learned transformations. We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics, and machine learning, and observe improvements over alternative algorithms.
APA
Jaini, P., Nielsen, D. & Welling, M. (2021). Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:3349-3357. Available from https://proceedings.mlr.press/v130/jaini21a.html.