Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals

James Brofos, Marylou Gabrie, Marcus A. Brubaker, Roy R. Lederman
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:5949-5986, 2022.

Abstract

Markov Chain Monte Carlo (MCMC) methods are a powerful tool for computation with complex probability distributions. However, the performance of such methods is critically dependent on properly tuned parameters, most of which are difficult if not impossible to know a priori for a given target distribution. Adaptive MCMC methods aim to address this by allowing the parameters to be updated during sampling based on previous samples from the chain, at the expense of requiring a new theoretical analysis to ensure convergence. In this work we extend the convergence theory of adaptive MCMC methods to a new class of methods built on a powerful class of parametric density estimators known as normalizing flows. In particular, we consider an independent Metropolis-Hastings sampler where the proposal distribution is represented by a normalizing flow whose parameters are updated using stochastic gradient descent. We explore the practical performance of this procedure both in synthetic settings and in the analysis of a physical field system, and compare it against both adaptive and non-adaptive MCMC methods.
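The abstract's procedure — an independent Metropolis-Hastings sampler whose proposal is adapted by stochastic gradient descent — can be illustrated with a minimal sketch. The target density, learning rate, and the use of a single affine map as the "flow" are illustrative assumptions, not details from the paper (which uses expressive deep normalizing flows and a general target):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: unnormalized log-density of N(3, 1) (not from the paper)
def log_target(x):
    return -0.5 * (x - 3.0) ** 2

# Simplest possible "normalizing flow": an affine map z -> mu + exp(log_sigma) * z
# of a standard normal base, so q is N(mu, sigma^2) with tractable density.
mu, log_sigma = 0.0, 0.0

def log_q(x, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((x - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

x = 0.0       # current chain state
lr = 0.01     # SGD step size (assumed)
samples = []
for t in range(5000):
    # Independent Metropolis-Hastings step: propose from q, ignore current state
    sigma = np.exp(log_sigma)
    xp = mu + sigma * rng.standard_normal()
    log_alpha = (log_target(xp) - log_q(xp, mu, log_sigma)) \
              - (log_target(x) - log_q(x, mu, log_sigma))
    if np.log(rng.uniform()) < log_alpha:
        x = xp
    samples.append(x)
    # Adaptation: one SGD step on -log q(x) using the current chain state,
    # with analytic gradients for the affine map
    z = (x - mu) / sigma
    mu += lr * z / sigma              # -d(-log q)/d mu      = (x - mu) / sigma^2
    log_sigma += lr * (z ** 2 - 1.0)  # -d(-log q)/d log_sigma = z^2 - 1
```

As the chain explores the target, the fitted proposal drifts toward it (here, mu approaches 3 and sigma approaches 1), which in turn raises the acceptance rate — the interplay whose convergence the paper analyzes.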

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-brofos22a,
  title     = {Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals},
  author    = {Brofos, James and Gabrie, Marylou and Brubaker, Marcus A. and Lederman, Roy R.},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {5949--5986},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/brofos22a/brofos22a.pdf},
  url       = {https://proceedings.mlr.press/v151/brofos22a.html},
  abstract  = {Markov Chain Monte Carlo (MCMC) methods are a powerful tool for computation with complex probability distributions. However, the performance of such methods is critically dependent on properly tuned parameters, most of which are difficult if not impossible to know a priori for a given target distribution. Adaptive MCMC methods aim to address this by allowing the parameters to be updated during sampling based on previous samples from the chain, at the expense of requiring a new theoretical analysis to ensure convergence. In this work we extend the convergence theory of adaptive MCMC methods to a new class of methods built on a powerful class of parametric density estimators known as normalizing flows. In particular, we consider an independent Metropolis-Hastings sampler where the proposal distribution is represented by a normalizing flow whose parameters are updated using stochastic gradient descent. We explore the practical performance of this procedure both in synthetic settings and in the analysis of a physical field system, and compare it against both adaptive and non-adaptive MCMC methods.}
}
Endnote
%0 Conference Paper
%T Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals
%A James Brofos
%A Marylou Gabrie
%A Marcus A. Brubaker
%A Roy R. Lederman
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-brofos22a
%I PMLR
%P 5949--5986
%U https://proceedings.mlr.press/v151/brofos22a.html
%V 151
%X Markov Chain Monte Carlo (MCMC) methods are a powerful tool for computation with complex probability distributions. However, the performance of such methods is critically dependent on properly tuned parameters, most of which are difficult if not impossible to know a priori for a given target distribution. Adaptive MCMC methods aim to address this by allowing the parameters to be updated during sampling based on previous samples from the chain, at the expense of requiring a new theoretical analysis to ensure convergence. In this work we extend the convergence theory of adaptive MCMC methods to a new class of methods built on a powerful class of parametric density estimators known as normalizing flows. In particular, we consider an independent Metropolis-Hastings sampler where the proposal distribution is represented by a normalizing flow whose parameters are updated using stochastic gradient descent. We explore the practical performance of this procedure both in synthetic settings and in the analysis of a physical field system, and compare it against both adaptive and non-adaptive MCMC methods.
APA
Brofos, J., Gabrie, M., Brubaker, M. A. &amp; Lederman, R. R. (2022). Adaptation of the Independent Metropolis-Hastings Sampler with Normalizing Flow Proposals. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:5949-5986. Available from https://proceedings.mlr.press/v151/brofos22a.html.