Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering

Peng Wang, Huikang Liu, Anthony Man-Cho So, Laura Balzano
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:22884-22918, 2022.

Abstract

The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union of subspaces model, where $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results of the studied method to support our theoretical developments.
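The abstract's description of KSS is concrete enough to sketch. Below is a minimal NumPy sketch of the classical K-subspaces alternation (assign each point to the subspace with smallest projection residual, then refit each subspace by truncated SVD). This is an illustrative implementation under a random-basis initialization, not the paper's thresholding inner-product spectral initialization; the function and parameter names are ours.

```python
import numpy as np

def k_subspaces(X, K, d, n_iters=50, seed=0):
    """Classical KSS alternation (illustrative sketch).

    X : (D, N) data matrix, columns are points.
    K : number of subspaces; d : dimension of each subspace.
    Returns final labels and a list of K orthonormal bases.
    """
    rng = np.random.default_rng(seed)
    D, N = X.shape
    # Random orthonormal bases as a stand-in initialization (the paper
    # instead proposes a thresholding inner-product spectral method).
    U = [np.linalg.qr(rng.standard_normal((D, d)))[0] for _ in range(K)]
    labels = np.zeros(N, dtype=int)
    for _ in range(n_iters):
        # Assignment step: residual norm of projecting each point
        # onto each current subspace; pick the nearest subspace.
        resid = np.stack(
            [np.linalg.norm(X - Uk @ (Uk.T @ X), axis=0) for Uk in U]
        )  # shape (K, N)
        new_labels = resid.argmin(axis=0)
        if np.array_equal(new_labels, labels):
            break  # fixed point of the alternation
        labels = new_labels
        # Update step: refit each subspace as the span of the top-d
        # left singular vectors of its assigned points.
        for k in range(K):
            Xk = X[:, labels == k]
            if Xk.shape[1] >= 1:
                Uk, _, _ = np.linalg.svd(Xk, full_matrices=False)
                U[k] = Uk[:, :d]
    return labels, U
```

The superlinear convergence and $\Theta(\log\log N)$ iteration bound in the paper concern exactly this alternation, once the initial assignment is close enough to a true clustering.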

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-wang22r,
  title     = {Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering},
  author    = {Wang, Peng and Liu, Huikang and So, Anthony Man-Cho and Balzano, Laura},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {22884--22918},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/wang22r/wang22r.pdf},
  url       = {https://proceedings.mlr.press/v162/wang22r.html},
  abstract  = {The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union of subspaces model, where $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results of the studied method to support our theoretical developments.}
}
Endnote
%0 Conference Paper
%T Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering
%A Peng Wang
%A Huikang Liu
%A Anthony Man-Cho So
%A Laura Balzano
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-wang22r
%I PMLR
%P 22884--22918
%U https://proceedings.mlr.press/v162/wang22r.html
%V 162
%X The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union of subspaces model, where $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results of the studied method to support our theoretical developments.
APA
Wang, P., Liu, H., So, A.M., & Balzano, L. (2022). Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:22884-22918. Available from https://proceedings.mlr.press/v162/wang22r.html.