Bayesian Quadrature on Riemannian Data Manifolds

Christian Fröhlich, Alexandra Gessner, Philipp Hennig, Bernhard Schölkopf, Georgios Arvanitidis
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:3459-3468, 2021.

Abstract

Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data. A Riemannian metric on said manifolds determines geometry-aware shortest paths and provides the means to define statistical models accordingly. However, these operations are typically computationally demanding. To ease this computational burden, we advocate probabilistic numerical methods for Riemannian statistics. In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws on Riemannian manifolds learned from data. In this task, each function evaluation relies on the solution of an expensive initial value problem. We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations and thus outperforms Monte Carlo methods on a wide range of integration problems. As a concrete application, we highlight the merits of adopting Riemannian geometry with our proposed framework on a nonlinear dataset from molecular dynamics.
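
To make the quadrature setting concrete, below is a minimal sketch of vanilla Bayesian quadrature in the Euclidean, one-dimensional case with a Gaussian integration measure. It is not the authors' manifold construction (there, each function evaluation additionally requires solving an expensive geodesic initial value problem, and the measure is a Riemannian normal law); the integrand, kernel lengthscale, and node locations are illustrative assumptions.

# Minimal sketch: vanilla Bayesian quadrature (BQ) in 1-D against a Gaussian
# measure pi = N(mu, sigma^2), using a squared-exponential GP prior on f.
# Illustrative Euclidean stand-in only, not the paper's Riemannian method.
import numpy as np

def bq_estimate(f, nodes, mu=0.0, sigma=1.0, ell=0.5, jitter=1e-10):
    """Return the BQ posterior mean and variance of I = E_{x~N(mu,sigma^2)}[f(x)]."""
    x = np.asarray(nodes, dtype=float)
    y = np.array([f(xi) for xi in x])   # (potentially expensive) function evaluations

    # Gram matrix of the SE kernel k(x, x') = exp(-(x-x')^2 / (2 ell^2)).
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2) + jitter * np.eye(len(x))

    # Kernel mean embedding z_i = E_{x~pi}[k(x, x_i)]; closed form for SE kernel + Gaussian pi.
    z = np.sqrt(ell**2 / (ell**2 + sigma**2)) * np.exp(
        -0.5 * (x - mu) ** 2 / (ell**2 + sigma**2)
    )
    # Initial variance E_{x,x'~pi}[k(x, x')], also closed form.
    c = np.sqrt(ell**2 / (ell**2 + 2.0 * sigma**2))

    w = np.linalg.solve(K, z)   # quadrature weights K^{-1} z
    mean = w @ y                # posterior mean of the integral
    var = c - w @ z             # posterior variance (remaining epistemic error)
    return mean, var

# Usage: integrate f(x) = sin(x)^2 against N(0, 1) with a handful of nodes.
if __name__ == "__main__":
    est, err = bq_estimate(lambda x: np.sin(x) ** 2, nodes=np.linspace(-3, 3, 9))
    print(f"BQ estimate: {est:.4f} +/- {np.sqrt(err):.4f}")

In the paper's setting the closed-form kernel mean embeddings above no longer hold on the learned manifold, and each evaluation is costly, which is why the prior model and the active selection of evaluation nodes described in the abstract become the decisive ingredients.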

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-frohlich21a,
  title     = {Bayesian Quadrature on Riemannian Data Manifolds},
  author    = {Fr{\"o}hlich, Christian and Gessner, Alexandra and Hennig, Philipp and Sch{\"o}lkopf, Bernhard and Arvanitidis, Georgios},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {3459--3468},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/frohlich21a/frohlich21a.pdf},
  url       = {https://proceedings.mlr.press/v139/frohlich21a.html}
}
Endnote
%0 Conference Paper
%T Bayesian Quadrature on Riemannian Data Manifolds
%A Christian Fröhlich
%A Alexandra Gessner
%A Philipp Hennig
%A Bernhard Schölkopf
%A Georgios Arvanitidis
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-frohlich21a
%I PMLR
%P 3459--3468
%U https://proceedings.mlr.press/v139/frohlich21a.html
%V 139
APA
Fröhlich, C., Gessner, A., Hennig, P., Schölkopf, B. & Arvanitidis, G. (2021). Bayesian Quadrature on Riemannian Data Manifolds. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:3459-3468. Available from https://proceedings.mlr.press/v139/frohlich21a.html.
