Streaming PCA: Matching Matrix Bernstein and Near-Optimal Finite Sample Guarantees for Oja’s Algorithm

Prateek Jain, Chi Jin, Sham M. Kakade, Praneeth Netrapalli, Aaron Sidford
29th Annual Conference on Learning Theory, PMLR 49:1147-1164, 2016.

Abstract

In this paper we provide improved guarantees for streaming principal component analysis (PCA). Given $A_1, \ldots, A_n \in \mathbb{R}^{d \times d}$ sampled independently from distributions satisfying $\mathbb{E}[A_i] = \Sigma$ for $\Sigma \succeq 0$, we present an $O(d)$-space linear-time single-pass streaming algorithm for estimating the top eigenvector of $\Sigma$. The algorithm nearly matches (and in certain cases improves upon) the accuracy obtained by the standard batch method that computes the top eigenvector of the empirical covariance $\frac{1}{n} \sum_{i \in [n]} A_i$ as analyzed by the matrix Bernstein inequality. Moreover, to achieve constant accuracy, our algorithm improves upon the best previously known sample complexities of streaming algorithms by a multiplicative factor of either $O(d)$ or $1/\mathrm{gap}$, where $\mathrm{gap}$ is the relative distance between the top two eigenvalues of $\Sigma$. We achieve these results through a novel analysis of the classic Oja’s algorithm, one of the oldest and perhaps most popular algorithms for streaming PCA. We show that simply picking a random initial point $w_0$ and applying the natural update rule $w_{i+1} = w_i + \eta_i A_i w_i$ suffices for a suitable choice of $\eta_i$. We believe our result sheds light on how to efficiently perform streaming PCA both in theory and in practice, and we hope that our analysis may serve as the basis for analyzing many variants and extensions of streaming PCA.
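The update rule described in the abstract is simple enough to sketch in a few lines. The following is a minimal illustration, not the paper's exact procedure: the step-size schedule shown is an arbitrary example, and the per-step renormalization (which only rescales the iterate and leaves its direction unchanged) is added here for numerical stability.

```python
import numpy as np

def oja_top_eigenvector(samples, etas, rng=None):
    """Estimate the top eigenvector of E[A_i] from a stream of matrices.

    Applies the update w_{i+1} = w_i + eta_i * A_i @ w_i from a random
    initial point w_0, renormalizing each step so the iterate stays on
    the unit sphere (this does not change its direction).
    """
    rng = np.random.default_rng(rng)
    d = samples[0].shape[0]
    w = rng.standard_normal(d)          # random initial point w_0
    w /= np.linalg.norm(w)
    for A, eta in zip(samples, etas):
        w = w + eta * (A @ w)           # Oja update
        w /= np.linalg.norm(w)          # rescale; only the direction matters
    return w

# Usage: a stream of rank-one samples x x^T with E[x x^T] = Sigma.
rng = np.random.default_rng(0)
Sigma = np.diag([1.0, 0.5, 0.25])
xs = rng.multivariate_normal(np.zeros(3), Sigma, size=5000)
samples = [np.outer(x, x) for x in xs]
etas = [1.0 / (100 + i) for i in range(len(samples))]  # illustrative schedule
w = oja_top_eigenvector(samples, etas, rng=1)
# w is a unit vector; with enough samples it tends to align with e_1,
# the top eigenvector of Sigma.
```

Each iteration costs one matrix-vector product and stores only the $d$-dimensional iterate, matching the $O(d)$-space, single-pass regime the abstract describes (for rank-one samples $x_i x_i^\top$, the product $A_i w$ can even be computed as $x_i (x_i^\top w)$ without forming the matrix).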

Cite this Paper


BibTeX
@InProceedings{pmlr-v49-jain16,
  title = {Streaming PCA: Matching Matrix Bernstein and Near-Optimal Finite Sample Guarantees for Oja's Algorithm},
  author = {Prateek Jain and Chi Jin and Sham M. Kakade and Praneeth Netrapalli and Aaron Sidford},
  booktitle = {29th Annual Conference on Learning Theory},
  pages = {1147--1164},
  year = {2016},
  editor = {Vitaly Feldman and Alexander Rakhlin and Ohad Shamir},
  volume = {49},
  series = {Proceedings of Machine Learning Research},
  address = {Columbia University, New York, New York, USA},
  month = {23--26 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v49/jain16.pdf},
  url = {http://proceedings.mlr.press/v49/jain16.html},
  abstract = {In this paper we provide improved guarantees for streaming principal component analysis (PCA). Given $A_1, \ldots, A_n \in \mathbb{R}^{d \times d}$ sampled independently from distributions satisfying $\mathbb{E}[A_i] = \Sigma$ for $\Sigma \succeq 0$, we present an $O(d)$-space linear-time single-pass streaming algorithm for estimating the top eigenvector of $\Sigma$. The algorithm nearly matches (and in certain cases improves upon) the accuracy obtained by the standard batch method that computes the top eigenvector of the empirical covariance $\frac{1}{n} \sum_{i \in [n]} A_i$ as analyzed by the matrix Bernstein inequality. Moreover, to achieve constant accuracy, our algorithm improves upon the best previously known sample complexities of streaming algorithms by a multiplicative factor of either $O(d)$ or $1/\mathrm{gap}$, where $\mathrm{gap}$ is the relative distance between the top two eigenvalues of $\Sigma$. We achieve these results through a novel analysis of the classic Oja's algorithm, one of the oldest and perhaps most popular algorithms for streaming PCA. We show that simply picking a random initial point $w_0$ and applying the natural update rule $w_{i+1} = w_i + \eta_i A_i w_i$ suffices for a suitable choice of $\eta_i$. We believe our result sheds light on how to efficiently perform streaming PCA both in theory and in practice, and we hope that our analysis may serve as the basis for analyzing many variants and extensions of streaming PCA.}
}
APA
Jain, P., Jin, C., Kakade, S. M., Netrapalli, P. & Sidford, A. (2016). Streaming PCA: Matching Matrix Bernstein and Near-Optimal Finite Sample Guarantees for Oja’s Algorithm. 29th Annual Conference on Learning Theory, in PMLR 49:1147-1164.
