Multitask Principal Component Analysis

Ikko Yamane, Florian Yger, Maxime Berar, Masashi Sugiyama ;
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:302-317, 2016.

Abstract

Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We address this issue by casting PCA into a multitask framework and, in doing so, show how to solve several related PCA problems simultaneously. To this end, we propose a new formulation of the PCA problem relying on a novel regularization. This regularization is based on a distance between subspaces, and the whole problem is solved as an optimization problem over a Riemannian manifold. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals.
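The abstract describes coupling several PCA problems through a regularizer measuring the distance between their subspaces. As a minimal sketch of that idea, the snippet below evaluates a joint objective combining per-task captured variance with a pairwise subspace penalty. The squared chordal distance used here is one common Grassmannian distance, chosen for illustration; the paper's exact regularizer, objective, and Riemannian solver may differ, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_covariance(X):
    # Empirical covariance of centered data (n_samples x n_features).
    Xc = X - X.mean(axis=0)
    return Xc.T @ Xc / len(Xc)

def top_subspace(C, k):
    # Orthonormal basis of the top-k eigenvectors of C (plain per-task PCA).
    _, vecs = np.linalg.eigh(C)
    return vecs[:, -k:]

def chordal_dist_sq(W1, W2):
    # Squared chordal distance between the spans of W1 and W2, a standard
    # distance on the Grassmann manifold (illustrative choice; the paper's
    # metric may differ).
    P1, P2 = W1 @ W1.T, W2 @ W2.T
    return 0.5 * np.linalg.norm(P1 - P2, "fro") ** 2

def multitask_pca_objective(Ws, Cs, lam):
    # Negative captured variance per task plus pairwise subspace penalty:
    # minimizing this trades off per-task fit against subspace agreement.
    fit = -sum(np.trace(W.T @ C @ W) for W, C in zip(Ws, Cs))
    reg = sum(
        chordal_dist_sq(Ws[i], Ws[j])
        for i in range(len(Ws)) for j in range(i + 1, len(Ws))
    )
    return fit + lam * reg

# Two related tasks: small samples drawn from nearby low-rank models.
d, k = 5, 2
B = rng.standard_normal((d, k))
X1 = rng.standard_normal((20, k)) @ B.T + 0.1 * rng.standard_normal((20, d))
B2 = B + 0.05 * rng.standard_normal((d, k))
X2 = rng.standard_normal((20, k)) @ B2.T + 0.1 * rng.standard_normal((20, d))

Cs = [task_covariance(X1), task_covariance(X2)]
Ws = [top_subspace(C, k) for C in Cs]  # independent PCA as a starting point
print(multitask_pca_objective(Ws, Cs, lam=1.0))
```

In a full method, the bases would then be optimized jointly (e.g. by Riemannian gradient descent over a product of Grassmann manifolds) rather than fixed at their independent PCA solutions.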
