Adaptive Annealed Importance Sampling with Constant Rate Progress

Shirin Goshtasbpour, Victor Cohen, Fernando Perez-Cruz
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:11642-11658, 2023.

Abstract

Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution given its unnormalized density function. The algorithm relies on a sequence of interpolating distributions that bridges the target to an initial tractable distribution, such as the well-known geometric mean path of unnormalized distributions, which is generally assumed to be suboptimal. In this paper, we prove that the geometric annealing corresponds to the distribution path that minimizes the KL divergence between the current particle distribution and the desired target when the feasible change in the particle distribution is constrained. Following this observation, we derive the constant-rate discretization schedule for this annealing sequence, which adapts the schedule to the difficulty of moving samples between the initial and the target distributions. We further extend our results to $f$-divergences and present the corresponding dynamics of the annealing sequences, based on which we propose the Constant Rate AIS (CR-AIS) algorithm and its efficient implementation for $\alpha$-divergences. We empirically show that CR-AIS performs well on multiple benchmark distributions while avoiding the computationally expensive tuning loop of existing Adaptive AIS methods.
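
To fix notation, the geometric mean path referred to above interpolates between the initial and target unnormalized densities as $\pi_\beta(x) \propto \pi_0(x)^{1-\beta} \pi_T(x)^{\beta}$ for $\beta \in [0, 1]$. Below is a minimal sketch of vanilla AIS along that path; the Gaussian initial and target densities, the random-walk Metropolis kernel, and the uniform $\beta$ schedule are illustrative assumptions only and do not reproduce the paper's constant-rate (CR-AIS) schedule.

import numpy as np

def log_p0(x):
    # tractable initial distribution: standard normal (illustrative choice)
    return -0.5 * np.sum(x**2, axis=-1)

def log_pT(x):
    # unnormalized target density (placeholder example, not from the paper)
    return -0.5 * np.sum((x - 2.0)**2, axis=-1)

def log_pi(x, beta):
    # geometric path: log pi_beta = (1 - beta) * log pi_0 + beta * log pi_T
    return (1.0 - beta) * log_p0(x) + beta * log_pT(x)

def ais(n_particles=1000, n_steps=50, step=0.5, dim=1, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_steps + 1)   # uniform schedule; CR-AIS would adapt this
    x = rng.standard_normal((n_particles, dim))  # particles drawn from pi_0
    log_w = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # importance-weight increment from advancing the annealing parameter
        log_w += log_pi(x, b) - log_pi(x, b_prev)
        # one random-walk Metropolis step targeting pi_b to move the particles
        prop = x + step * rng.standard_normal(x.shape)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop, b) - log_pi(x, b)
        x[accept] = prop[accept]
    # log of the estimated normalizing-constant ratio Z_T / Z_0 (log-sum-exp for stability)
    log_Z = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
    return x, log_w, log_Z

The weighted particles (x, log_w) returned by ais() approximate the target, and log_Z estimates its log normalizing constant; the paper's contribution concerns how the betas sequence should be chosen and discretized, which this uniform schedule deliberately does not capture.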

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-goshtasbpour23a,
  title     = {Adaptive Annealed Importance Sampling with Constant Rate Progress},
  author    = {Goshtasbpour, Shirin and Cohen, Victor and Perez-Cruz, Fernando},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {11642--11658},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/goshtasbpour23a/goshtasbpour23a.pdf},
  url       = {https://proceedings.mlr.press/v202/goshtasbpour23a.html}
}
Endnote
%0 Conference Paper
%T Adaptive Annealed Importance Sampling with Constant Rate Progress
%A Shirin Goshtasbpour
%A Victor Cohen
%A Fernando Perez-Cruz
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-goshtasbpour23a
%I PMLR
%P 11642--11658
%U https://proceedings.mlr.press/v202/goshtasbpour23a.html
%V 202
APA
Goshtasbpour, S., Cohen, V. & Perez-Cruz, F. (2023). Adaptive Annealed Importance Sampling with Constant Rate Progress. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:11642-11658. Available from https://proceedings.mlr.press/v202/goshtasbpour23a.html.