On the Approximability of Sparse PCA

Siu On Chan, Dimitris Papailliopoulos, Aviad Rubinstein
29th Annual Conference on Learning Theory, PMLR 49:623-646, 2016.

Abstract

It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include:
1. a simple and efficient algorithm that achieves an n^{-1/3}-approximation;
2. NP-hardness of approximation to within (1 − ε), for some small constant ε > 0;
3. SSE-hardness of approximation to within any constant factor; and
4. an exp exp(Ω(√(log log n))) (“quasi-quasi-polynomial”) gap for the standard semidefinite program.
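As a quick reference (not part of the published abstract), the optimization problem behind the abstract and what is commonly called the standard SDP relaxation are sketched below. The assumption that the ℓ1-constrained SDP shown here is the relaxation meant by item 4 is ours; consult the paper for the exact formulation it analyzes.

% Sparse PCA: given a symmetric input matrix A (e.g. a covariance matrix)
% and a sparsity parameter k, maximize the quadratic form over k-sparse
% unit vectors. The commonly used SDP relaxation replaces xx^T by a PSD
% matrix X and relaxes the sparsity constraint via an entrywise l1 bound.
\[
  \mathrm{OPT}(A, k)
  \;=\; \max_{\|x\|_2 = 1,\; \|x\|_0 \le k} x^{\top} A x
  \;\le\; \max_{\substack{X \succeq 0,\; \operatorname{tr}(X) = 1,\\ \sum_{i,j} |X_{ij}| \le k}} \operatorname{tr}(A X),
\]
% where the inequality holds because X = xx^T is SDP-feasible whenever x is
% a k-sparse unit vector (since \|x\|_1^2 \le k\,\|x\|_2^2 = k).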

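The relaxation above can be written in a few lines with an off-the-shelf solver. The following is a minimal illustrative sketch using cvxpy on a random PSD matrix; the library choice, the data, and the naive rounding step at the end are assumptions for illustration and are not the paper's n^{-1/3}-approximation algorithm.

# Minimal sketch of the l1-constrained SDP relaxation for Sparse PCA,
# solved with cvxpy (library choice and data are illustrative assumptions,
# not taken from the paper).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 5                       # dimension and sparsity (hypothetical)
B = rng.standard_normal((n, n))
A = B @ B.T                        # a random PSD "covariance" matrix

X = cp.Variable((n, n), symmetric=True)
constraints = [
    X >> 0,                        # X is positive semidefinite
    cp.trace(X) == 1,              # relaxes ||x||_2 = 1
    cp.sum(cp.abs(X)) <= k,        # relaxes ||x||_0 <= k via ||x||_1^2 <= k
]
sdp = cp.Problem(cp.Maximize(cp.trace(A @ X)), constraints)
sdp.solve()
print("SDP upper bound:", sdp.value)

# Naive rounding (a simple heuristic for illustration only): take the top
# eigenvector of X and keep its k largest-magnitude entries.
w, V = np.linalg.eigh(X.value)
v = V[:, -1]
support = np.argsort(np.abs(v))[-k:]
x = np.zeros(n)
x[support] = v[support]
x /= np.linalg.norm(x)
print("k-sparse objective:", x @ A @ x)

The gap between the SDP value and the best k-sparse objective is exactly the kind of integrality gap that item 4 of the abstract bounds from below.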
Cite this Paper


BibTeX
@InProceedings{pmlr-v49-chan16,
  title     = {On the Approximability of Sparse PCA},
  author    = {Chan, Siu On and Papailliopoulos, Dimitris and Rubinstein, Aviad},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {623--646},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/chan16.pdf},
  url       = {https://proceedings.mlr.press/v49/chan16.html},
  abstract  = {It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include: \begin{enumerate} \item a simple and efficient algorithm that achieves an $n^{-1/3}$-approximation; \item NP-hardness of approximation to within $(1-\varepsilon)$, for some small constant $\varepsilon > 0$; \item SSE-hardness of approximation to within \emph{any} constant factor; and \item an $\exp\exp\left(\Omega\left(\sqrt{\log\log n}\right)\right)$ (``quasi-quasi-polynomial'') gap for the standard semidefinite program. \end{enumerate}}
}
Endnote
%0 Conference Paper
%T On the Approximability of Sparse PCA
%A Siu On Chan
%A Dimitris Papailliopoulos
%A Aviad Rubinstein
%B 29th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2016
%E Vitaly Feldman
%E Alexander Rakhlin
%E Ohad Shamir
%F pmlr-v49-chan16
%I PMLR
%P 623--646
%U https://proceedings.mlr.press/v49/chan16.html
%V 49
%X It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include: (1) a simple and efficient algorithm that achieves an n^{-1/3}-approximation; (2) NP-hardness of approximation to within (1 − ε), for some small constant ε > 0; (3) SSE-hardness of approximation to within any constant factor; and (4) an exp exp(Ω(√(log log n))) (“quasi-quasi-polynomial”) gap for the standard semidefinite program.
RIS
TY  - CPAPER
TI  - On the Approximability of Sparse PCA
AU  - Siu On Chan
AU  - Dimitris Papailliopoulos
AU  - Aviad Rubinstein
BT  - 29th Annual Conference on Learning Theory
DA  - 2016/06/06
ED  - Vitaly Feldman
ED  - Alexander Rakhlin
ED  - Ohad Shamir
ID  - pmlr-v49-chan16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 49
SP  - 623
EP  - 646
L1  - http://proceedings.mlr.press/v49/chan16.pdf
UR  - https://proceedings.mlr.press/v49/chan16.html
AB  - It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include: (1) a simple and efficient algorithm that achieves an n^{-1/3}-approximation; (2) NP-hardness of approximation to within (1 − ε), for some small constant ε > 0; (3) SSE-hardness of approximation to within any constant factor; and (4) an exp exp(Ω(√(log log n))) (“quasi-quasi-polynomial”) gap for the standard semidefinite program.
ER  -
APA
Chan, S.O., Papailliopoulos, D. & Rubinstein, A. (2016). On the Approximability of Sparse PCA. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:623-646. Available from https://proceedings.mlr.press/v49/chan16.html.