Geometry-Aware Principal Component Analysis for Symmetric Positive Definite Matrices

Inbal Horev, Florian Yger, Masashi Sugiyama
Asian Conference on Machine Learning, PMLR 45:1-16, 2016.

Abstract

Symmetric positive definite (SPD) matrices, e.g. covariance matrices, are ubiquitous in machine learning applications. However, because their size grows as n^2 (where n is the number of variables), their high dimensionality is a central concern when working with them, and it is often useful to apply dimensionality reduction techniques to them. Principal component analysis (PCA) is a canonical tool for dimensionality reduction that, for vector data, reduces the dimension of the input while maximizing the preserved variance. Yet the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading performance. In this paper we develop a new Riemannian-geometry-based formulation of PCA for SPD matrices that i) preserves more data variance by appropriately extending PCA to matrix data, and ii) extends the standard definition from Euclidean to Riemannian geometry. We experimentally demonstrate the usefulness of our approach as a pre-processing step for EEG signals.
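
As a concrete illustration of the setting the abstract describes, here is a minimal, self-contained Python sketch, not the paper's algorithm: each n x n SPD matrix X is compressed to a smaller p x p SPD matrix W^T X W with an orthonormal W, and the retained variance is measured under a Riemannian (log-Euclidean) metric rather than the Euclidean one. The eigenvector-based choice of W and all function names below are illustrative assumptions.

import numpy as np

def random_spd(n, rng):
    # Draw a random n x n symmetric positive definite matrix.
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def spd_log(X):
    # Matrix logarithm of an SPD matrix via its eigendecomposition (always real).
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_variance(mats):
    # Variance under the log-Euclidean metric: mean squared Frobenius
    # distance of log(X_i) from the Euclidean mean of the logs.
    logs = np.array([spd_log(X) for X in mats])
    mean_log = logs.mean(axis=0)
    return np.mean([np.linalg.norm(L - mean_log, "fro") ** 2 for L in logs])

def compress(mats, W):
    # Bilinear compression X -> W^T X W; the result is SPD whenever
    # W has full column rank, so the reduced data stay on the SPD manifold.
    return [W.T @ X @ W for X in mats]

rng = np.random.default_rng(0)
n, p, N = 8, 3, 50                      # original size, reduced size, sample count
mats = [random_spd(n, rng) for _ in range(N)]

# Illustrative (assumed) choice of W: the top-p eigenvectors of the arithmetic
# mean matrix. The paper instead seeks a projection that maximizes retained
# variance under the Riemannian metric; this baseline only shows the pipeline.
eigvals, eigvecs = np.linalg.eigh(np.mean(mats, axis=0))
W = eigvecs[:, -p:]                     # n x p, orthonormal columns

full_var = log_euclidean_variance(mats)
kept_var = log_euclidean_variance(compress(mats, W))
print(f"log-Euclidean variance: full = {full_var:.3f}, compressed = {kept_var:.3f}")

The bilinear map W^T X W is used here because it keeps the reduced data symmetric positive definite; choosing the projection so as to retain variance with respect to the Riemannian geometry of the SPD manifold, as the abstract describes, is the subject of the paper itself.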

Cite this Paper


BibTeX
@InProceedings{pmlr-v45-Horev15,
  title     = {Geometry-Aware Principal Component Analysis for Symmetric Positive Definite Matrices},
  author    = {Horev, Inbal and Yger, Florian and Sugiyama, Masashi},
  booktitle = {Asian Conference on Machine Learning},
  pages     = {1--16},
  year      = {2016},
  editor    = {Holmes, Geoffrey and Liu, Tie-Yan},
  volume    = {45},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hong Kong},
  month     = {20--22 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v45/Horev15.pdf},
  url       = {https://proceedings.mlr.press/v45/Horev15.html},
  abstract  = {Symmetric positive definite (SPD) matrices, e.g. covariance matrices, are ubiquitous in machine learning applications. However, because their size grows as n^2 (where n is the number of variables) their high-dimensionality is a crucial point when working with them. Thus, it is often useful to apply to them dimensionality reduction techniques. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data reduces the dimension of the input data while maximizing the preserved variance. Yet, the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading the performance. In this paper we develop a new Riemannian geometry based formulation of PCA for SPD matrices that i) preserves more data variance by appropriately extending PCA to matrix data, and ii) extends the standard definition from the Euclidean to the Riemannian geometries. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.}
}
Endnote
%0 Conference Paper
%T Geometry-Aware Principal Component Analysis for Symmetric Positive Definite Matrices
%A Inbal Horev
%A Florian Yger
%A Masashi Sugiyama
%B Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Geoffrey Holmes
%E Tie-Yan Liu
%F pmlr-v45-Horev15
%I PMLR
%P 1--16
%U https://proceedings.mlr.press/v45/Horev15.html
%V 45
%X Symmetric positive definite (SPD) matrices, e.g. covariance matrices, are ubiquitous in machine learning applications. However, because their size grows as n^2 (where n is the number of variables) their high-dimensionality is a crucial point when working with them. Thus, it is often useful to apply to them dimensionality reduction techniques. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data reduces the dimension of the input data while maximizing the preserved variance. Yet, the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading the performance. In this paper we develop a new Riemannian geometry based formulation of PCA for SPD matrices that i) preserves more data variance by appropriately extending PCA to matrix data, and ii) extends the standard definition from the Euclidean to the Riemannian geometries. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.
RIS
TY - CPAPER
TI - Geometry-Aware Principal Component Analysis for Symmetric Positive Definite Matrices
AU - Inbal Horev
AU - Florian Yger
AU - Masashi Sugiyama
BT - Asian Conference on Machine Learning
DA - 2016/02/25
ED - Geoffrey Holmes
ED - Tie-Yan Liu
ID - pmlr-v45-Horev15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 45
SP - 1
EP - 16
L1 - http://proceedings.mlr.press/v45/Horev15.pdf
UR - https://proceedings.mlr.press/v45/Horev15.html
AB - Symmetric positive definite (SPD) matrices, e.g. covariance matrices, are ubiquitous in machine learning applications. However, because their size grows as n^2 (where n is the number of variables) their high-dimensionality is a crucial point when working with them. Thus, it is often useful to apply to them dimensionality reduction techniques. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data reduces the dimension of the input data while maximizing the preserved variance. Yet, the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading the performance. In this paper we develop a new Riemannian geometry based formulation of PCA for SPD matrices that i) preserves more data variance by appropriately extending PCA to matrix data, and ii) extends the standard definition from the Euclidean to the Riemannian geometries. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.
ER -
APA
Horev, I., Yger, F. & Sugiyama, M. (2016). Geometry-Aware Principal Component Analysis for Symmetric Positive Definite Matrices. Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 45:1-16. Available from https://proceedings.mlr.press/v45/Horev15.html.

Related Material

Download PDF: http://proceedings.mlr.press/v45/Horev15.pdf