MixFlows: principled variational inference via mixed flows

Zuheng Xu, Naitong Chen, Trevor Campbell
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:38342-38376, 2023.

Abstract

This work presents mixed variational flows (MixFlows), a new variational family that consists of a mixture of repeated applications of a map to an initial reference distribution. First, we provide efficient algorithms for i.i.d. sampling, density evaluation, and unbiased ELBO estimation. We then show that MixFlows have MCMC-like convergence guarantees when the flow map is ergodic and measure-preserving, and provide bounds on the accumulation of error for practical implementations where the flow map is approximated. Finally, we develop an implementation of MixFlows based on uncorrected discretized Hamiltonian dynamics combined with deterministic momentum refreshment. Simulated and real data experiments show that MixFlows can provide more reliable posterior approximations than several black-box normalizing flows, as well as samples of comparable quality to those obtained from state-of-the-art MCMC methods.
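To make the construction concrete: writing q0 for the reference distribution and T for the flow map, the MixFlow family described above is the uniform mixture of pushforwards (1/(N+1)) * sum_{n=0}^{N} T^n q0 (the exact indexing convention here is ours, not the paper's). The sketch below, in Python with illustrative names (mixflow_sample and mixflow_logpdf are not from the paper), indicates why i.i.d. sampling and exact density evaluation are both tractable whenever T is a bijection with a computable inverse and log-Jacobian determinant:

import numpy as np

def mixflow_sample(q0_sample, T, N, rng):
    # Pick a mixture component n uniformly from {0, ..., N}, then push a
    # reference draw through the flow map n times.
    n = rng.integers(0, N + 1)
    x = q0_sample(rng)
    for _ in range(n):
        x = T(x)
    return x

def mixflow_logpdf(q0_logpdf, T_inv, logdet_jac_inv, N, x):
    # The n-th component has density q0(T^{-n}(x)) * |det J_{T^{-n}}(x)|.
    # Pull x back one inverse step at a time, accumulating log-Jacobian
    # terms, then average the N + 1 component densities in log space.
    terms = [q0_logpdf(x)]            # n = 0: no map applications
    y, log_j = x, 0.0
    for _ in range(N):
        log_j += logdet_jac_inv(y)    # log|det J_{T^{-1}}| at the current point
        y = T_inv(y)
        terms.append(q0_logpdf(y) + log_j)
    terms = np.array(terms)
    m = terms.max()
    return m + np.log(np.mean(np.exp(terms - m)))   # log-mean-exp over components

With x drawn this way and log_p the (unnormalized) target log-density, log_p(x) - mixflow_logpdf(...) is one simple unbiased single-sample ELBO estimate; the paper's own estimators, the ergodic measure-preserving case, and the error analysis for approximated flow maps are developed in the text.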

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-xu23b,
  title     = {{M}ix{F}lows: principled variational inference via mixed flows},
  author    = {Xu, Zuheng and Chen, Naitong and Campbell, Trevor},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {38342--38376},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/xu23b/xu23b.pdf},
  url       = {https://proceedings.mlr.press/v202/xu23b.html}
}
Endnote
%0 Conference Paper
%T MixFlows: principled variational inference via mixed flows
%A Zuheng Xu
%A Naitong Chen
%A Trevor Campbell
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-xu23b
%I PMLR
%P 38342--38376
%U https://proceedings.mlr.press/v202/xu23b.html
%V 202
APA
Xu, Z., Chen, N., & Campbell, T. (2023). MixFlows: principled variational inference via mixed flows. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:38342-38376. Available from https://proceedings.mlr.press/v202/xu23b.html.
