A connection between Tempering and Entropic Mirror Descent

Nicolas Chopin, Francesca Crucinio, Anna Korba
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:8782-8800, 2024.

Abstract

This paper explores the connections between tempering (for Sequential Monte Carlo; SMC) and entropic mirror descent to sample from a target probability distribution whose unnormalized density is known. We establish that tempering SMC corresponds to entropic mirror descent applied to the reverse Kullback-Leibler (KL) divergence and obtain convergence rates for the tempering iterates. Our result motivates the tempering iterates from an optimization point of view, showing that tempering can be seen as a descent scheme of the KL divergence with respect to the Fisher-Rao geometry, in contrast to Langevin dynamics that perform descent of the KL with respect to the Wasserstein-2 geometry. We exploit the connection between tempering and mirror descent iterates to justify common practices in SMC and derive adaptive tempering rules that improve over other alternative benchmarks in the literature.
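The adaptive tempering rules mentioned in the abstract operate on the tempered path $\pi_\lambda \propto \pi_0^{1-\lambda}\pi^{\lambda}$ and, in common SMC practice, pick each new exponent so that the effective sample size (ESS) of the incremental importance weights hits a preset fraction of the particle count. A minimal sketch of that standard ESS-bisection rule (generic SMC practice, not the paper's specific derivation; `next_temperature` and its arguments are illustrative names):

```python
import numpy as np

def ess(logw):
    """Effective sample size of weights given by log-weights logw."""
    w = np.exp(logw - logw.max())  # stabilise before exponentiating
    return w.sum() ** 2 / (w ** 2).sum()

def next_temperature(log_ratio, lam, target_ess, tol=1e-6):
    """Choose the next tempering exponent by bisection.

    log_ratio[i] = log pi(x_i) - log pi_0(x_i) for particle x_i;
    the incremental weights at candidate exponent lam' are
    w_i ∝ exp((lam' - lam) * log_ratio[i]).  We search for the
    largest lam' in (lam, 1] whose ESS equals target_ess.
    """
    lo, hi = lam, 1.0
    if ess((hi - lam) * log_ratio) >= target_ess:
        return 1.0  # can jump straight to the target distribution
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ess((mid - lam) * log_ratio) < target_ess:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

Usage: with `N` particles one typically sets `target_ess = alpha * N` for some `alpha` in (0, 1), e.g. `alpha = 0.5`, and calls `next_temperature` at every SMC step until the exponent reaches 1.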

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-chopin24a,
  title = {A connection between Tempering and Entropic Mirror Descent},
  author = {Chopin, Nicolas and Crucinio, Francesca and Korba, Anna},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages = {8782--8800},
  year = {2024},
  editor = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume = {235},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/chopin24a/chopin24a.pdf},
  url = {https://proceedings.mlr.press/v235/chopin24a.html},
  abstract = {This paper explores the connections between tempering (for Sequential Monte Carlo; SMC) and entropic mirror descent to sample from a target probability distribution whose unnormalized density is known. We establish that tempering SMC corresponds to entropic mirror descent applied to the reverse Kullback-Leibler (KL) divergence and obtain convergence rates for the tempering iterates. Our result motivates the tempering iterates from an optimization point of view, showing that tempering can be seen as a descent scheme of the KL divergence with respect to the Fisher-Rao geometry, in contrast to Langevin dynamics that perform descent of the KL with respect to the Wasserstein-2 geometry. We exploit the connection between tempering and mirror descent iterates to justify common practices in SMC and derive adaptive tempering rules that improve over other alternative benchmarks in the literature.}
}
Endnote
%0 Conference Paper
%T A connection between Tempering and Entropic Mirror Descent
%A Nicolas Chopin
%A Francesca Crucinio
%A Anna Korba
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-chopin24a
%I PMLR
%P 8782--8800
%U https://proceedings.mlr.press/v235/chopin24a.html
%V 235
%X This paper explores the connections between tempering (for Sequential Monte Carlo; SMC) and entropic mirror descent to sample from a target probability distribution whose unnormalized density is known. We establish that tempering SMC corresponds to entropic mirror descent applied to the reverse Kullback-Leibler (KL) divergence and obtain convergence rates for the tempering iterates. Our result motivates the tempering iterates from an optimization point of view, showing that tempering can be seen as a descent scheme of the KL divergence with respect to the Fisher-Rao geometry, in contrast to Langevin dynamics that perform descent of the KL with respect to the Wasserstein-2 geometry. We exploit the connection between tempering and mirror descent iterates to justify common practices in SMC and derive adaptive tempering rules that improve over other alternative benchmarks in the literature.
APA
Chopin, N., Crucinio, F., & Korba, A. (2024). A connection between Tempering and Entropic Mirror Descent. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:8782-8800. Available from https://proceedings.mlr.press/v235/chopin24a.html.