Robust Stochastic Principal Component Analysis

John Goes, Teng Zhang, Raman Arora, Gilad Lerman;
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:266-274, 2014.

Abstract

We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in both artificial and real scenarios.
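To make the non-robust baseline concrete, here is a minimal sketch of the standard stochastic power method (an Oja-style update) that the abstract names as a starting point. This is not the paper's robust algorithm, only the classical PCA update it extends: each sample triggers one gradient step on the subspace estimate, followed by re-orthonormalization. All data dimensions and step sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: points near a 2-D subspace of R^10, plus small noise.
d, k, n = 10, 2, 5000
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]
X = rng.standard_normal((n, k)) @ basis.T + 0.05 * rng.standard_normal((n, d))

# Stochastic power method: one rank-k Oja update per streamed sample,
# keeping only the d-by-k estimate U in memory (minimal storage).
U = np.linalg.qr(rng.standard_normal((d, k)))[0]
for t, x in enumerate(X, start=1):
    eta = 1.0 / t                  # decaying step size (assumed schedule)
    U += eta * np.outer(x, x @ U)  # gradient step toward top-k covariance eigenspace
    U = np.linalg.qr(U)[0]         # re-orthonormalize the columns

# Cosine of the largest principal angle between the learned and true subspaces;
# a value near 1 means the subspace was recovered.
alignment = np.linalg.svd(U.T @ basis, compute_uv=False)[-1]
print(alignment)
```

The paper's robust variants modify updates of this kind to down-weight outlying samples; without such a modification, a few gross outliers in `X` can pull `U` arbitrarily far from the true subspace.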
