Nonnegative Sparse PCA with Provable Guarantees

Megasthenis Asteris, Dimitris Papailiopoulos, Alexandros Dimakis;
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1728-1736, 2014.

Abstract

We introduce a novel algorithm to compute nonnegative sparse principal components of positive semidefinite (PSD) matrices. Our algorithm comes with approximation guarantees contingent on the spectral profile of the input matrix A: the sharper the eigenvalue decay, the better the approximation quality. If the eigenvalues decay like any asymptotically vanishing function, we can approximate nonnegative sparse PCA within any accuracy ε in time polynomial in the matrix size n and the desired sparsity k, but not in 1/ε. Further, we obtain a data-dependent bound that is computed by executing an algorithm on a given dataset. This bound is significantly tighter than a priori bounds, and can be used to show that for all tested datasets our algorithm is provably within 40%-90% of the unknown optimum. Our algorithm is combinatorial and explores a subspace defined by the leading eigenvectors of A. We test our scheme on several datasets, showing that it matches or outperforms the previous state of the art.
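The abstract's core idea, searching the span of the leading eigenvectors of A for a good nonnegative k-sparse direction, can be illustrated with a minimal sketch. This is not the authors' exact combinatorial algorithm (which enumerates the subspace with provable guarantees); it uses random sampling in the leading-d-eigenvector subspace instead, and all function and parameter names are hypothetical:

```python
import numpy as np

def nonneg_sparse_pc(A, k, d=2, n_samples=500, seed=0):
    """Illustrative sketch: search the span of the d leading
    eigenvectors of the PSD matrix A for a nonnegative, k-sparse
    unit vector x approximately maximizing x^T A x."""
    rng = np.random.default_rng(seed)
    # Eigendecomposition; eigh returns eigenvalues in ascending order,
    # so the last d columns span the leading subspace.
    vals, vecs = np.linalg.eigh(A)
    V = vecs[:, -d:]                      # n x d basis of the leading subspace
    best_x, best_val = None, -np.inf
    for _ in range(n_samples):
        # Random unit direction within the leading subspace
        c = rng.standard_normal(d)
        c /= np.linalg.norm(c)
        v = V @ c
        # Project onto the nonnegative orthant (try -v if v has no
        # positive entries), then keep only the k largest entries.
        x = np.maximum(v, 0.0)
        if np.count_nonzero(x) == 0:
            x = np.maximum(-v, 0.0)
        x[np.argsort(x)[:-k]] = 0.0       # zero everything outside the top-k
        nrm = np.linalg.norm(x)
        if nrm == 0:
            continue
        x /= nrm
        val = x @ A @ x                   # explained variance of this candidate
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

Consistent with the paper's message, the quality of such a subspace search depends on how much of A's spectrum the d leading eigenvectors capture: sharper eigenvalue decay means less variance is lost by restricting the search to that subspace.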
