Robust Stochastic Principal Component Analysis

John Goes, Teng Zhang, Raman Arora, Gilad Lerman
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:266-274, 2014.

Abstract

We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in artificial and real scenarios.
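The abstract names the stochastic power method as one of the baselines being robustified, without detailing the update. As a rough illustration only (not the paper's formulation), an Oja-style stochastic power method for a k-dimensional subspace, with a simple norm-clipping heuristic standing in for a robustness step, might look like:

```python
import numpy as np

def stochastic_power_method(stream, d, k, eta=0.05, clip=3.0):
    """Oja-style stochastic update of a k-dimensional principal subspace.

    `clip` caps each sample's norm -- a crude outlier-downweighting
    heuristic used here purely for illustration.
    """
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal init
    for x in stream:
        r = np.linalg.norm(x)
        if r > clip:                      # shrink suspected outliers
            x = x * (clip / r)
        U += eta * np.outer(x, x @ U)     # stochastic gradient step
        U, _ = np.linalg.qr(U)            # re-orthonormalize the basis
    return U

# toy usage: noisy samples near a random 2-D subspace of R^10
rng = np.random.default_rng(1)
B, _ = np.linalg.qr(rng.standard_normal((10, 2)))
data = rng.standard_normal((500, 2)) @ B.T + 0.05 * rng.standard_normal((500, 10))
U = stochastic_power_method(data, d=10, k=2)
err = np.linalg.norm(B - U @ (U.T @ B))   # residual of B outside span(U)
```

The per-sample update touches only a d-by-k matrix, which is what gives stochastic methods of this kind their minimal storage and runtime footprint.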

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-goes14,
  title = {{Robust Stochastic Principal Component Analysis}},
  author = {Goes, John and Zhang, Teng and Arora, Raman and Lerman, Gilad},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages = {266--274},
  year = {2014},
  editor = {Kaski, Samuel and Corander, Jukka},
  volume = {33},
  series = {Proceedings of Machine Learning Research},
  address = {Reykjavik, Iceland},
  month = {22--25 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v33/goes14.pdf},
  url = {https://proceedings.mlr.press/v33/goes14.html},
  abstract = {We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in artificial and real scenarios.}
}
Endnote
%0 Conference Paper
%T Robust Stochastic Principal Component Analysis
%A John Goes
%A Teng Zhang
%A Raman Arora
%A Gilad Lerman
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-goes14
%I PMLR
%P 266--274
%U https://proceedings.mlr.press/v33/goes14.html
%V 33
%X We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in artificial and real scenarios.
RIS
TY - CPAPER
TI - Robust Stochastic Principal Component Analysis
AU - John Goes
AU - Teng Zhang
AU - Raman Arora
AU - Gilad Lerman
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
DA - 2014/04/02
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-goes14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 33
SP - 266
EP - 274
L1 - http://proceedings.mlr.press/v33/goes14.pdf
UR - https://proceedings.mlr.press/v33/goes14.html
AB - We consider the problem of finding lower-dimensional subspaces in the presence of outliers and noise in the online setting. In particular, we extend previous batch formulations of robust PCA to the stochastic setting with minimal storage requirements and runtime complexity. We introduce three novel stochastic approximation algorithms for robust PCA that are extensions of standard algorithms for PCA: the stochastic power method, incremental PCA, and online PCA using matrix-exponentiated-gradient (MEG) updates. For robust online PCA we also give a sub-linear convergence guarantee. Our numerical results demonstrate the superiority of the robust online method over the other robust stochastic methods, and the advantage of robust methods over their non-robust counterparts in the presence of outliers, in artificial and real scenarios.
ER -
APA
Goes, J., Zhang, T., Arora, R. & Lerman, G. (2014). Robust Stochastic Principal Component Analysis. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:266-274. Available from https://proceedings.mlr.press/v33/goes14.html.