Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach

Tianyu Ding, Zhihui Zhu, Rene Vidal, Daniel P Robinson
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:2739-2748, 2021.

Abstract

The Dual Principal Component Pursuit (DPCP) method has been proposed to robustly recover a subspace of high relative dimension from corrupted data. Existing analyses and algorithms of DPCP, however, mainly focus on finding a normal to a single hyperplane that contains the inliers. Although these algorithms can be extended to a subspace of higher co-dimension through a recursive approach that sequentially finds a new basis element of the space orthogonal to the subspace, this procedure is computationally expensive and lacks convergence guarantees. In this paper, we consider a DPCP approach for simultaneously computing the entire basis of the orthogonal complement subspace (we call this a holistic approach) by solving a non-convex non-smooth optimization problem over the Grassmannian. We provide geometric and statistical analyses of global optimality and prove that the method can tolerate as many outliers as the square of the number of inliers, in both noiseless and noisy settings. We then present a Riemannian regularity condition for the problem, which is used to prove that a Riemannian subgradient method converges linearly to a neighborhood of the orthogonal complement subspace, with error proportional to the noise level.
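The holistic formulation described above can be illustrated with a short numerical sketch: minimize f(B) = Σ_j ‖Bᵀx_j‖₂ over orthonormal D×c matrices B, where c is the co-dimension of the inlier subspace, using a Riemannian subgradient step with a geometrically decaying step size (the schedule under which the paper establishes linear convergence). This is a minimal illustration under assumed default parameters (`mu0`, `beta`, `iters` are not taken from the paper), not the authors' reference implementation.

```python
import numpy as np

def dpcp_subgradient(X, c, mu0=0.1, beta=0.99, iters=500, seed=0):
    """Riemannian subgradient sketch for holistic DPCP.

    Minimizes f(B) = (1/L) * sum_j ||B^T x_j||_2 over orthonormal
    B (D x c), seeking a basis of the orthogonal complement of the
    inlier subspace. Step sizes decay geometrically: mu_k = mu0 * beta^k.
    Parameter defaults are illustrative assumptions, not the paper's.
    """
    D, L = X.shape
    rng = np.random.default_rng(seed)
    B, _ = np.linalg.qr(rng.standard_normal((D, c)))  # random orthonormal init
    mu = mu0
    for _ in range(iters):
        R = B.T @ X                                    # c x L residuals
        norms = np.maximum(np.linalg.norm(R, axis=0), 1e-12)
        G = X @ (R / norms).T / L                      # Euclidean subgradient
        G_riem = G - B @ (B.T @ G)                     # project onto tangent space
        B, _ = np.linalg.qr(B - mu * G_riem)           # retraction via QR
        mu *= beta
    return B
```

On synthetic data (unit-norm inliers spanning a d-dimensional subspace plus random outliers), the returned B is close to a basis of the orthogonal complement: the largest cosine of the principal angles between span(B) and the inlier subspace is driven toward zero.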

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-ding21b,
  title     = {Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach},
  author    = {Ding, Tianyu and Zhu, Zhihui and Vidal, Rene and Robinson, Daniel P},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {2739--2748},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/ding21b/ding21b.pdf},
  url       = {http://proceedings.mlr.press/v139/ding21b.html},
  abstract  = {The Dual Principal Component Pursuit (DPCP) method has been proposed to robustly recover a subspace of high-relative dimension from corrupted data. Existing analyses and algorithms of DPCP, however, mainly focus on finding a normal to a single hyperplane that contains the inliers. Although these algorithms can be extended to a subspace of higher co-dimension through a recursive approach that sequentially finds a new basis element of the space orthogonal to the subspace, this procedure is computationally expensive and lacks convergence guarantees. In this paper, we consider a DPCP approach for simultaneously computing the entire basis of the orthogonal complement subspace (we call this a holistic approach) by solving a non-convex non-smooth optimization problem over the Grassmannian. We provide geometric and statistical analyses for the global optimality and prove that it can tolerate as many outliers as the square of the number of inliers, under both noiseless and noisy settings. We then present a Riemannian regularity condition for the problem, which is then used to prove that a Riemannian subgradient method converges linearly to a neighborhood of the orthogonal subspace with error proportional to the noise level.}
}
Endnote
%0 Conference Paper
%T Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach
%A Tianyu Ding
%A Zhihui Zhu
%A Rene Vidal
%A Daniel P Robinson
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-ding21b
%I PMLR
%P 2739--2748
%U http://proceedings.mlr.press/v139/ding21b.html
%V 139
%X The Dual Principal Component Pursuit (DPCP) method has been proposed to robustly recover a subspace of high-relative dimension from corrupted data. Existing analyses and algorithms of DPCP, however, mainly focus on finding a normal to a single hyperplane that contains the inliers. Although these algorithms can be extended to a subspace of higher co-dimension through a recursive approach that sequentially finds a new basis element of the space orthogonal to the subspace, this procedure is computationally expensive and lacks convergence guarantees. In this paper, we consider a DPCP approach for simultaneously computing the entire basis of the orthogonal complement subspace (we call this a holistic approach) by solving a non-convex non-smooth optimization problem over the Grassmannian. We provide geometric and statistical analyses for the global optimality and prove that it can tolerate as many outliers as the square of the number of inliers, under both noiseless and noisy settings. We then present a Riemannian regularity condition for the problem, which is then used to prove that a Riemannian subgradient method converges linearly to a neighborhood of the orthogonal subspace with error proportional to the noise level.
APA
Ding, T., Zhu, Z., Vidal, R. & Robinson, D.P. (2021). Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:2739-2748. Available from http://proceedings.mlr.press/v139/ding21b.html.
