Geodesically-convex optimization for averaging partially observed covariance matrices

Florian Yger, Sylvain Chevallier, Quentin Barthélemy, Suvrit Sra
Proceedings of The 12th Asian Conference on Machine Learning, PMLR 129:417-432, 2020.

Abstract

Symmetric positive definite (SPD) matrices permeate numerous scientific disciplines, including machine learning, optimization, and signal processing. Equipped with a Riemannian geometry, the space of SPD matrices benefits from compelling properties, and its Riemannian mean is now the gold standard in some applications, e.g. brain-computer interfaces (BCI). This paper addresses the problem of averaging covariance matrices with missing variables. This situation often occurs with inexpensive or unreliable sensors, or when artifact-suppression techniques remove corrupted sensors, leading to rank-deficient matrices and hindering the use of Riemannian geometry in covariance-based approaches. An alternative, but questionable, method is to remove the matrices with missing variables, thus reducing the training set size. We address these limitations and propose a new formulation grounded in geodesic convexity. Our approach is evaluated on generated datasets with a controlled number of missing variables and a known baseline, demonstrating the robustness of the proposed estimator. The practical interest of this approach is assessed on real BCI datasets. Our results show that the proposed average is more robust and better suited for classification than classical data imputation methods.
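For context, the Riemannian mean referred to above is the Fréchet mean of covariance matrices C_1, ..., C_N under the affine-invariant metric on the SPD manifold. This is the standard definition, stated here for readers unfamiliar with the terminology; it is not reproduced from the paper itself:

\[
\delta_R(A, B) = \left\| \log\!\left( A^{-1/2} B A^{-1/2} \right) \right\|_F ,
\qquad
\bar{C} = \operatorname*{arg\,min}_{X \succ 0} \; \sum_{i=1}^{N} \delta_R^2(X, C_i).
\]

This objective is geodesically convex on the SPD manifold, which is the property the paper's formulation for partially observed matrices builds on. As a minimal illustration, the sketch below computes the affine-invariant Riemannian mean of fully observed SPD matrices via the classical fixed-point iteration, using only NumPy and SciPy. It does not implement the paper's estimator for partially observed matrices; the function name spd_mean_riemann and all parameter choices are illustrative.

    import numpy as np
    from scipy.linalg import expm, logm, sqrtm

    def spd_mean_riemann(mats, tol=1e-8, max_iter=50):
        """Affine-invariant Riemannian mean of fully observed SPD matrices,
        via the classical fixed-point iteration (illustrative sketch only)."""
        X = np.mean(mats, axis=0)  # Euclidean mean as a warm start
        for _ in range(max_iter):
            X_half = np.real(sqrtm(X))
            X_ihalf = np.linalg.inv(X_half)
            # Average of the log-maps of each matrix at the current estimate
            T = np.mean([np.real(logm(X_ihalf @ C @ X_ihalf)) for C in mats],
                        axis=0)
            X = X_half @ np.real(expm(T)) @ X_half
            if np.linalg.norm(T, ord="fro") < tol:  # gradient-norm stopping rule
                break
        return X

    # Usage on randomly generated SPD matrices
    rng = np.random.default_rng(0)
    mats = []
    for _ in range(5):
        A = rng.standard_normal((4, 4))
        mats.append(A @ A.T + 1e-2 * np.eye(4))  # A A^T + eps I is SPD
    C_bar = spd_mean_riemann(mats)

In the partially observed setting considered in the paper, each C_i is only observed on a subset of its rows and columns, so the sum in the Fréchet-mean objective cannot be formed directly; the paper's contribution is a geodesically convex formulation that handles those missing variables without discarding matrices or imputing data.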

Cite this Paper


BibTeX
@InProceedings{pmlr-v129-yger20a,
  title     = {Geodesically-convex optimization for averaging partially observed covariance matrices},
  author    = {Yger, Florian and Chevallier, Sylvain and Barth\'elemy, Quentin and Sra, Suvrit},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  pages     = {417--432},
  year      = {2020},
  editor    = {Pan, Sinno Jialin and Sugiyama, Masashi},
  volume    = {129},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--20 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v129/yger20a/yger20a.pdf},
  url       = {https://proceedings.mlr.press/v129/yger20a.html},
  abstract  = {Symmetric positive definite (SPD) matrices permeate numerous scientific disciplines, including machine learning, optimization, and signal processing. Equipped with a Riemannian geometry, the space of SPD matrices benefits from compelling properties, and its Riemannian mean is now the gold standard in some applications, e.g. brain-computer interfaces (BCI). This paper addresses the problem of averaging covariance matrices with missing variables. This situation often occurs with inexpensive or unreliable sensors, or when artifact-suppression techniques remove corrupted sensors, leading to rank-deficient matrices and hindering the use of Riemannian geometry in covariance-based approaches. An alternative, but questionable, method is to remove the matrices with missing variables, thus reducing the training set size. We address these limitations and propose a new formulation grounded in geodesic convexity. Our approach is evaluated on generated datasets with a controlled number of missing variables and a known baseline, demonstrating the robustness of the proposed estimator. The practical interest of this approach is assessed on real BCI datasets. Our results show that the proposed average is more robust and better suited for classification than classical data imputation methods.}
}
Endnote
%0 Conference Paper
%T Geodesically-convex optimization for averaging partially observed covariance matrices
%A Florian Yger
%A Sylvain Chevallier
%A Quentin Barthélemy
%A Suvrit Sra
%B Proceedings of The 12th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Sinno Jialin Pan
%E Masashi Sugiyama
%F pmlr-v129-yger20a
%I PMLR
%P 417--432
%U https://proceedings.mlr.press/v129/yger20a.html
%V 129
%X Symmetric positive definite (SPD) matrices permeate numerous scientific disciplines, including machine learning, optimization, and signal processing. Equipped with a Riemannian geometry, the space of SPD matrices benefits from compelling properties, and its Riemannian mean is now the gold standard in some applications, e.g. brain-computer interfaces (BCI). This paper addresses the problem of averaging covariance matrices with missing variables. This situation often occurs with inexpensive or unreliable sensors, or when artifact-suppression techniques remove corrupted sensors, leading to rank-deficient matrices and hindering the use of Riemannian geometry in covariance-based approaches. An alternative, but questionable, method is to remove the matrices with missing variables, thus reducing the training set size. We address these limitations and propose a new formulation grounded in geodesic convexity. Our approach is evaluated on generated datasets with a controlled number of missing variables and a known baseline, demonstrating the robustness of the proposed estimator. The practical interest of this approach is assessed on real BCI datasets. Our results show that the proposed average is more robust and better suited for classification than classical data imputation methods.
APA
Yger, F., Chevallier, S., Barthélemy, Q. & Sra, S. (2020). Geodesically-convex optimization for averaging partially observed covariance matrices. Proceedings of The 12th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 129:417-432. Available from https://proceedings.mlr.press/v129/yger20a.html.