Multitask Principal Component Analysis

Ikko Yamane, Florian Yger, Maxime Berar, Masashi Sugiyama
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:302-317, 2016.

Abstract

Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We alleviate this issue by casting PCA into a multitask framework, showing how to solve several related PCA problems simultaneously. To this end, we propose a novel formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as a pre-processing step for EEG signals.
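To make the two ingredients of the abstract concrete, here is a minimal sketch, not the paper's algorithm: plain PCA computed from the sample covariance (the estimator whose quality degrades with few samples), together with the projection (chordal) distance between subspaces, one natural choice of the kind of subspace distance a multitask regularizer could be built on. All function names are illustrative.

```python
import numpy as np

def pca_components(X, k):
    """Top-k principal directions of the rows of X, via the sample covariance."""
    Xc = X - X.mean(axis=0)               # center the data
    cov = Xc.T @ Xc / (len(X) - 1)        # sample covariance estimate
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    return vecs[:, ::-1][:, :k]           # columns = top-k eigenvectors

def subspace_distance(U, V):
    """Projection (chordal) distance between the column-spaces of U and V.

    Illustrative only: the paper's regularizer uses *a* distance between
    subspaces, not necessarily this one.
    """
    return np.linalg.norm(U @ U.T - V @ V.T, ord="fro")

rng = np.random.default_rng(0)
X1 = rng.standard_normal((30, 5))
X2 = X1 + 0.1 * rng.standard_normal((30, 5))  # a "related task"
U = pca_components(X1, 2)
V = pca_components(X2, 2)
d = subspace_distance(U, V)  # small, since the two tasks are closely related
```

A multitask variant in the paper's spirit would add a penalty proportional to such a distance between the tasks' subspaces to each task's PCA objective, then optimize all subspaces jointly over the appropriate Riemannian manifold.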

Cite this Paper

BibTeX
@InProceedings{pmlr-v63-yamane65,
  title     = {Multitask Principal Component Analysis},
  author    = {Yamane, Ikko and Yger, Florian and Berar, Maxime and Sugiyama, Masashi},
  booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
  pages     = {302--317},
  year      = {2016},
  editor    = {Durrant, Robert J. and Kim, Kee-Eung},
  volume    = {63},
  series    = {Proceedings of Machine Learning Research},
  address   = {The University of Waikato, Hamilton, New Zealand},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v63/yamane65.pdf},
  url       = {https://proceedings.mlr.press/v63/yamane65.html},
  abstract  = {Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We leverage this issue by casting the PCA into a multitask framework, and doing so, we show how to solve simultaneously several related PCA problems. Hence, we propose a novel formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.}
}
Endnote
%0 Conference Paper
%T Multitask Principal Component Analysis
%A Ikko Yamane
%A Florian Yger
%A Maxime Berar
%A Masashi Sugiyama
%B Proceedings of The 8th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Robert J. Durrant
%E Kee-Eung Kim
%F pmlr-v63-yamane65
%I PMLR
%P 302--317
%U https://proceedings.mlr.press/v63/yamane65.html
%V 63
%X Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We leverage this issue by casting the PCA into a multitask framework, and doing so, we show how to solve simultaneously several related PCA problems. Hence, we propose a novel formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.
RIS
TY  - CPAPER
TI  - Multitask Principal Component Analysis
AU  - Ikko Yamane
AU  - Florian Yger
AU  - Maxime Berar
AU  - Masashi Sugiyama
BT  - Proceedings of The 8th Asian Conference on Machine Learning
DA  - 2016/11/20
ED  - Robert J. Durrant
ED  - Kee-Eung Kim
ID  - pmlr-v63-yamane65
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 63
SP  - 302
EP  - 317
L1  - http://proceedings.mlr.press/v63/yamane65.pdf
UR  - https://proceedings.mlr.press/v63/yamane65.html
AB  - Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We leverage this issue by casting the PCA into a multitask framework, and doing so, we show how to solve simultaneously several related PCA problems. Hence, we propose a novel formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.
ER  -
APA
Yamane, I., Yger, F., Berar, M., & Sugiyama, M. (2016). Multitask Principal Component Analysis. Proceedings of The 8th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 63:302-317. Available from https://proceedings.mlr.press/v63/yamane65.html.