Wrapped Gaussian on the manifold of Symmetric Positive Definite Matrices

Thibault De Surrel, Fabien Lotte, Sylvain Chevallier, Florian Yger
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:12816-12839, 2025.

Abstract

Circular and non-flat data distributions are prevalent across diverse domains of data science, yet their specific geometric structures often remain underutilized in machine learning frameworks. A principled approach to accounting for the underlying geometry of such data is pivotal, particularly when extending statistical models such as the pervasive Gaussian distribution. In this work, we tackle this issue by focusing on the manifold of symmetric positive definite (SPD) matrices, a key object in information geometry. We introduce a non-isotropic wrapped Gaussian by leveraging the exponential map, derive theoretical properties of this distribution, and propose a maximum likelihood framework for parameter estimation. Furthermore, we reinterpret established classifiers on SPD matrices through a probabilistic lens and introduce new classifiers based on the wrapped Gaussian model. Experiments on synthetic and real-world datasets demonstrate the robustness and flexibility of this geometry-aware distribution, underscoring its potential to advance manifold-based data analysis. This work lays the groundwork for extending classical machine learning and statistical methods to more complex and structured data.
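The construction described in the abstract, drawing a Gaussian in the tangent space at a mean point and pushing it onto the manifold through the exponential map, can be sketched as follows. This is an illustrative sample under the affine-invariant metric, not the paper's implementation; the function names and the upper-triangular parametrization of the tangent space are assumptions made for this sketch.

```python
import numpy as np

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, U = np.linalg.eigh(A)
    return U @ np.diag(np.exp(w)) @ U.T

def sym_from_vec(v, d):
    """Rebuild a symmetric d x d matrix from its upper-triangular entries."""
    M = np.zeros((d, d))
    M[np.triu_indices(d)] = v
    return M + M.T - np.diag(np.diag(M))

def sample_wrapped_gaussian(Sigma, Gamma, n, rng):
    """Draw n samples from a wrapped Gaussian on SPD(d) centred at Sigma:
    sample tangent vectors ~ N(0, Gamma) in a d(d+1)/2-dimensional
    parametrization, then push them onto the manifold with the
    affine-invariant exponential map
        Exp_Sigma(V) = Sigma^{1/2} expm(Sigma^{-1/2} V Sigma^{-1/2}) Sigma^{1/2}.
    """
    d = Sigma.shape[0]
    w, U = np.linalg.eigh(Sigma)
    S = U @ np.diag(np.sqrt(w)) @ U.T            # Sigma^{1/2}
    Sinv = U @ np.diag(1.0 / np.sqrt(w)) @ U.T   # Sigma^{-1/2}
    k = d * (d + 1) // 2                         # tangent-space dimension
    vs = rng.multivariate_normal(np.zeros(k), Gamma, size=n)
    out = []
    for v in vs:
        V = sym_from_vec(v, d)                   # tangent vector at Sigma
        out.append(S @ expm_sym(Sinv @ V @ Sinv) @ S)
    return np.array(out)
```

Because the exponential map on SPD matrices is defined for every symmetric tangent vector, each sample is guaranteed to land back on the manifold (symmetric with strictly positive eigenvalues); a non-identity covariance `Gamma` makes the wrapped distribution non-isotropic.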

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-de-surrel25a,
  title     = {Wrapped {G}aussian on the manifold of Symmetric Positive Definite Matrices},
  author    = {De Surrel, Thibault and Lotte, Fabien and Chevallier, Sylvain and Yger, Florian},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {12816--12839},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/de-surrel25a/de-surrel25a.pdf},
  url       = {https://proceedings.mlr.press/v267/de-surrel25a.html},
  abstract  = {Circular and non-flat data distributions are prevalent across diverse domains of data science, yet their specific geometric structures often remain underutilized in machine learning frameworks. A principled approach to accounting for the underlying geometry of such data is pivotal, particularly when extending statistical models such as the pervasive Gaussian distribution. In this work, we tackle this issue by focusing on the manifold of symmetric positive definite (SPD) matrices, a key object in information geometry. We introduce a non-isotropic wrapped Gaussian by leveraging the exponential map, derive theoretical properties of this distribution, and propose a maximum likelihood framework for parameter estimation. Furthermore, we reinterpret established classifiers on SPD matrices through a probabilistic lens and introduce new classifiers based on the wrapped Gaussian model. Experiments on synthetic and real-world datasets demonstrate the robustness and flexibility of this geometry-aware distribution, underscoring its potential to advance manifold-based data analysis. This work lays the groundwork for extending classical machine learning and statistical methods to more complex and structured data.}
}
Endnote
%0 Conference Paper
%T Wrapped Gaussian on the manifold of Symmetric Positive Definite Matrices
%A Thibault De Surrel
%A Fabien Lotte
%A Sylvain Chevallier
%A Florian Yger
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-de-surrel25a
%I PMLR
%P 12816--12839
%U https://proceedings.mlr.press/v267/de-surrel25a.html
%V 267
%X Circular and non-flat data distributions are prevalent across diverse domains of data science, yet their specific geometric structures often remain underutilized in machine learning frameworks. A principled approach to accounting for the underlying geometry of such data is pivotal, particularly when extending statistical models such as the pervasive Gaussian distribution. In this work, we tackle this issue by focusing on the manifold of symmetric positive definite (SPD) matrices, a key object in information geometry. We introduce a non-isotropic wrapped Gaussian by leveraging the exponential map, derive theoretical properties of this distribution, and propose a maximum likelihood framework for parameter estimation. Furthermore, we reinterpret established classifiers on SPD matrices through a probabilistic lens and introduce new classifiers based on the wrapped Gaussian model. Experiments on synthetic and real-world datasets demonstrate the robustness and flexibility of this geometry-aware distribution, underscoring its potential to advance manifold-based data analysis. This work lays the groundwork for extending classical machine learning and statistical methods to more complex and structured data.
APA
De Surrel, T., Lotte, F., Chevallier, S. & Yger, F. (2025). Wrapped Gaussian on the manifold of Symmetric Positive Definite Matrices. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:12816-12839. Available from https://proceedings.mlr.press/v267/de-surrel25a.html.