Fixed-Parameter and Approximation Algorithms for PCA with Outliers

Yogesh Dahiya, Fedor Fomin, Fahad Panolan, Kirill Simonov
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:2341-2351, 2021.

Abstract

PCA with Outliers is the fundamental problem of identifying an underlying low-dimensional subspace in a data set corrupted with outliers. A large body of work is devoted to the information-theoretic aspects of this problem. However, from the computational perspective, its complexity is still not well-understood. We study this problem from the perspective of parameterized complexity by investigating how parameters like the dimension of the data, the subspace dimension, the number of outliers and their structure, and approximation error, influence the computational complexity of the problem. Our algorithmic methods are based on techniques of randomized linear algebra and algebraic geometry.
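To make the problem concrete: PCA with Outliers asks, given n data points, an outlier budget k, and a target dimension r, to discard k points so that the remaining points are as close as possible to some r-dimensional subspace. The following is a naive brute-force sketch of that objective (not the paper's fixed-parameter or approximation algorithms); the function name and the small toy data set are illustrative assumptions, and the residual of the best rank-r fit is computed from the trailing singular values via the Eckart-Young theorem.

```python
import itertools
import numpy as np

def pca_with_outliers_bruteforce(X, r, k):
    """Try every set of k outliers; for each, the cost of the remaining
    points is the squared residual of their best rank-r approximation,
    i.e. the sum of squared singular values beyond the top r
    (Eckart-Young). Exponential in k; for illustration only."""
    n = X.shape[0]
    best_cost, best_outliers = np.inf, None
    for outliers in itertools.combinations(range(n), k):
        inliers = np.delete(X, list(outliers), axis=0)
        s = np.linalg.svd(inliers, compute_uv=False)
        cost = float(np.sum(s[r:] ** 2))
        if cost < best_cost:
            best_cost, best_outliers = cost, set(outliers)
    return best_cost, best_outliers

# Toy data: three points on the x-axis plus one point far off it.
# With r=1 and k=1, removing the off-axis point gives a perfect fit.
X = np.array([[1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [0.0, 5.0]])
cost, outliers = pca_with_outliers_bruteforce(X, r=1, k=1)
# outliers == {3}, cost == 0.0
```

The exhaustive search over outlier sets is what makes the problem computationally hard as k grows; the paper's contribution is precisely to analyze how parameters such as k, r, and the ambient dimension govern this complexity.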

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-dahiya21b,
  title     = {Fixed-Parameter and Approximation Algorithms for PCA with Outliers},
  author    = {Dahiya, Yogesh and Fomin, Fedor and Panolan, Fahad and Simonov, Kirill},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {2341--2351},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/dahiya21b/dahiya21b.pdf},
  url       = {https://proceedings.mlr.press/v139/dahiya21b.html},
  abstract  = {PCA with Outliers is the fundamental problem of identifying an underlying low-dimensional subspace in a data set corrupted with outliers. A large body of work is devoted to the information-theoretic aspects of this problem. However, from the computational perspective, its complexity is still not well-understood. We study this problem from the perspective of parameterized complexity by investigating how parameters like the dimension of the data, the subspace dimension, the number of outliers and their structure, and approximation error, influence the computational complexity of the problem. Our algorithmic methods are based on techniques of randomized linear algebra and algebraic geometry.}
}
Endnote
%0 Conference Paper
%T Fixed-Parameter and Approximation Algorithms for PCA with Outliers
%A Yogesh Dahiya
%A Fedor Fomin
%A Fahad Panolan
%A Kirill Simonov
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-dahiya21b
%I PMLR
%P 2341--2351
%U https://proceedings.mlr.press/v139/dahiya21b.html
%V 139
%X PCA with Outliers is the fundamental problem of identifying an underlying low-dimensional subspace in a data set corrupted with outliers. A large body of work is devoted to the information-theoretic aspects of this problem. However, from the computational perspective, its complexity is still not well-understood. We study this problem from the perspective of parameterized complexity by investigating how parameters like the dimension of the data, the subspace dimension, the number of outliers and their structure, and approximation error, influence the computational complexity of the problem. Our algorithmic methods are based on techniques of randomized linear algebra and algebraic geometry.
APA
Dahiya, Y., Fomin, F., Panolan, F., & Simonov, K. (2021). Fixed-Parameter and Approximation Algorithms for PCA with Outliers. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:2341-2351. Available from https://proceedings.mlr.press/v139/dahiya21b.html.