Geometric No-U-Turn Samplers: Concepts and Evaluation

Bernardo Williams, Hanlin Yu, Marcelo Hartmann, Arto Klami
Proceedings of The 12th International Conference on Probabilistic Graphical Models, PMLR 246:327-347, 2024.

Abstract

We enhance geometric Markov Chain Monte Carlo methods, in particular making them easier to use by providing better tools for choosing the metric and various tuning parameters. We extend the No-U-Turn criterion for automatic choice of integration length for Lagrangian Monte Carlo and propose a modification to the computationally efficient Monge metric, as well as summarizing several previously proposed metric choices. Through extensive experimentation, including synthetic examples and posteriordb benchmarks, we demonstrate that Riemannian metrics can outperform Euclidean counterparts, particularly in scenarios with high curvature, while highlighting how the optimal choice of metric is problem-specific.
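As background for the abstract's terminology, the minimal Python sketch below shows the two ingredients it references in their standard, previously published forms: the Euclidean No-U-Turn stopping rule of Hoffman and Gelman (2014) and the Monge metric of Hartmann et al. (2022). The function names, the parameter `alpha`, and the NumPy implementation are illustrative assumptions; the paper's Lagrangian extension of the criterion and its proposed modification of the metric are not reproduced here.

```python
import numpy as np

def monge_metric(grad_logp, alpha=1.0):
    """Standard Monge metric G(theta) = I + alpha^2 * g g^T, where g is the
    gradient of the log-density (Hartmann et al., 2022). Cheap to build and
    invert via the matrix inversion lemma; the paper's modified variant is
    not shown here."""
    g = np.asarray(grad_logp, dtype=float)
    return np.eye(g.size) + alpha**2 * np.outer(g, g)

def euclidean_no_u_turn(theta_minus, theta_plus, p_minus, p_plus):
    """Standard Euclidean No-U-Turn check (Hoffman & Gelman, 2014): stop
    doubling the trajectory once it starts folding back on itself, i.e.
    when either end's momentum points against the end-to-end displacement.
    The paper extends this idea to Lagrangian (Riemannian) dynamics."""
    dtheta = np.asarray(theta_plus) - np.asarray(theta_minus)
    return np.dot(dtheta, p_minus) < 0 or np.dot(dtheta, p_plus) < 0
```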

Cite this Paper


BibTeX
@InProceedings{pmlr-v246-williams24a,
  title     = {Geometric No-U-Turn Samplers: Concepts and Evaluation},
  author    = {Williams, Bernardo and Yu, Hanlin and Hartmann, Marcelo and Klami, Arto},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  pages     = {327--347},
  year      = {2024},
  editor    = {Kwisthout, Johan and Renooij, Silja},
  volume    = {246},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--13 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v246/main/assets/williams24a/williams24a.pdf},
  url       = {https://proceedings.mlr.press/v246/williams24a.html},
  abstract  = {We enhance geometric Markov Chain Monte Carlo methods, in particular making them easier to use by providing better tools for choosing the metric and various tuning parameters. We extend the No-U-Turn criterion for automatic choice of integration length for Lagrangian Monte Carlo and propose a modification to the computationally efficient Monge metric, as well as summarizing several previously proposed metric choices. Through extensive experimentation, including synthetic examples and posteriordb benchmarks, we demonstrate that Riemannian metrics can outperform Euclidean counterparts, particularly in scenarios with high curvature, while highlighting how the optimal choice of metric is problem-specific.}
}
Endnote
%0 Conference Paper
%T Geometric No-U-Turn Samplers: Concepts and Evaluation
%A Bernardo Williams
%A Hanlin Yu
%A Marcelo Hartmann
%A Arto Klami
%B Proceedings of The 12th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2024
%E Johan Kwisthout
%E Silja Renooij
%F pmlr-v246-williams24a
%I PMLR
%P 327--347
%U https://proceedings.mlr.press/v246/williams24a.html
%V 246
%X We enhance geometric Markov Chain Monte Carlo methods, in particular making them easier to use by providing better tools for choosing the metric and various tuning parameters. We extend the No-U-Turn criterion for automatic choice of integration length for Lagrangian Monte Carlo and propose a modification to the computationally efficient Monge metric, as well as summarizing several previously proposed metric choices. Through extensive experimentation, including synthetic examples and posteriordb benchmarks, we demonstrate that Riemannian metrics can outperform Euclidean counterparts, particularly in scenarios with high curvature, while highlighting how the optimal choice of metric is problem-specific.
APA
Williams, B., Yu, H., Hartmann, M. & Klami, A. (2024). Geometric No-U-Turn Samplers: Concepts and Evaluation. Proceedings of The 12th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 246:327-347. Available from https://proceedings.mlr.press/v246/williams24a.html.
