Riemannian Laplace Approximation with the Fisher Metric

Hanlin Yu, Marcelo Hartmann, Bernardo Williams Moreno Sanchez, Mark Girolami, Arto Klami
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:820-828, 2024.

Abstract

Laplace’s method approximates a target density with a Gaussian distribution at its mode. It is computationally efficient and asymptotically exact for Bayesian inference due to the Bernstein-von Mises theorem, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace Approximation transforms the Gaussian approximation according to a chosen Riemannian geometry, providing a richer approximation family while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric; indeed, the metric adopted in previous work results in approximations that are overly narrow as well as being biased even at the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact at the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.
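As background for the abstract, the standard (Euclidean) Laplace approximation it generalizes can be sketched in a few lines: find the mode of the log-density, then use the negative inverse curvature at the mode as the Gaussian variance. The sketch below is not from the paper; the Gamma target and all names are illustrative assumptions, chosen because the mode and curvature are known in closed form.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 1-D target for illustration only: an unnormalized
# Gamma(a, b) log-density (not an example from the paper).
a, b = 5.0, 2.0
def log_target(x):
    return (a - 1.0) * np.log(x) - b * x

# Step 1: find the mode by maximizing the log-density.
res = minimize(lambda x: -log_target(x[0]), x0=[1.0], bounds=[(1e-6, None)])
mode = res.x[0]

# Step 2: the curvature (second derivative) of the log-density at the
# mode gives the Gaussian variance: var = -1 / d2, here via finite
# differences.
eps = 1e-5
d2 = (log_target(mode + eps) - 2.0 * log_target(mode)
      + log_target(mode - eps)) / eps**2
var = -1.0 / d2

# The Laplace approximation is N(mode, var).
# For Gamma(5, 2): mode = (a-1)/b = 2, curvature-based variance = 1.
```

The Riemannian generalization discussed in the paper replaces this fixed Gaussian with one transported along a chosen metric (here, the Fisher metric), which this Euclidean sketch does not attempt to reproduce.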

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-yu24a,
  title     = {Riemannian {L}aplace Approximation with the {F}isher Metric},
  author    = {Yu, Hanlin and Hartmann, Marcelo and Williams Moreno Sanchez, Bernardo and Girolami, Mark and Klami, Arto},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {820--828},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/yu24a/yu24a.pdf},
  url       = {https://proceedings.mlr.press/v238/yu24a.html},
  abstract  = {Laplace’s method approximates a target density with a Gaussian distribution at its mode. It is computationally efficient and asymptotically exact for Bayesian inference due to the Bernstein-von Mises theorem, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace Approximation transforms the Gaussian approximation according to a chosen Riemannian geometry providing a richer approximation family, while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric, indeed the metric adopted in previous work results in approximations that are overly narrow as well as being biased even at the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact at the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.}
}
Endnote
%0 Conference Paper
%T Riemannian Laplace Approximation with the Fisher Metric
%A Hanlin Yu
%A Marcelo Hartmann
%A Bernardo Williams Moreno Sanchez
%A Mark Girolami
%A Arto Klami
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-yu24a
%I PMLR
%P 820--828
%U https://proceedings.mlr.press/v238/yu24a.html
%V 238
%X Laplace’s method approximates a target density with a Gaussian distribution at its mode. It is computationally efficient and asymptotically exact for Bayesian inference due to the Bernstein-von Mises theorem, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace Approximation transforms the Gaussian approximation according to a chosen Riemannian geometry providing a richer approximation family, while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric, indeed the metric adopted in previous work results in approximations that are overly narrow as well as being biased even at the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact at the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.
APA
Yu, H., Hartmann, M., Williams Moreno Sanchez, B., Girolami, M. & Klami, A. (2024). Riemannian Laplace Approximation with the Fisher Metric. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:820-828. Available from https://proceedings.mlr.press/v238/yu24a.html.