On Structured Filtering-Clustering: Global Error Bound and Optimal First-Order Algorithms

Nhat Ho, Tianyi Lin, Michael Jordan
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:896-921, 2022.

Abstract

Filtering-clustering models, including trend filtering and convex clustering, have become an important source of ideas and modeling tools in machine learning and related fields. The statistical guarantees of optimal solutions in these models have been extensively studied, yet investigation of their computational aspects has remained limited. In particular, practitioners often employ first-order algorithms in real-world applications and observe strong empirical performance despite the ill-conditioned structure of the difference operator matrices, leaving open the problem of understanding the convergence properties of first-order algorithms. This paper settles this open problem and contributes to the broad interplay between statistics and optimization by identifying a global error bound condition, which is satisfied by a large class of dual filtering-clustering problems, and by designing a class of generalized dual gradient ascent algorithms that are optimal first-order algorithms in the deterministic, finite-sum, and online settings. These results are new and help explain why filtering-clustering models can be efficiently solved by first-order algorithms. We further provide a detailed convergence rate analysis of the proposed algorithms in each setting, shedding light on their potential to solve filtering-clustering models efficiently. Finally, experiments on real datasets demonstrate the effectiveness of our algorithms.
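To make the dual approach described in the abstract concrete, here is a minimal NumPy sketch of projected dual gradient ascent applied to the simplest filtering-clustering instance, 1-D fused-lasso trend filtering (minimize 0.5*||y - x||^2 + lam*||Dx||_1 with D the first-difference operator). The function name, step-size choice, and iteration count are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np

def trend_filter_dual_ga(y, lam, step=None, iters=500):
    """Projected dual gradient ascent for 1-D fused-lasso trend filtering:
        min_x  0.5 * ||y - x||^2 + lam * ||D x||_1,
    where D is the (n-1) x n first-difference operator. The dual is
        max_u  -0.5 * ||y - D^T u||^2   subject to  ||u||_inf <= lam,
    solved by gradient ascent plus projection onto the box [-lam, lam]."""
    n = len(y)
    # First-difference matrix D: row i is (... -1, 1 ...), so (D x)_i = x[i+1] - x[i].
    D = np.diff(np.eye(n), axis=0)
    if step is None:
        # 1/L with L = ||D||_2^2, the Lipschitz constant of the dual gradient.
        step = 1.0 / np.linalg.norm(D, 2) ** 2
    u = np.zeros(n - 1)  # dual variable, kept feasible in [-lam, lam]
    for _ in range(iters):
        grad = D @ (y - D.T @ u)                 # gradient of the concave dual
        u = np.clip(u + step * grad, -lam, lam)  # ascent step + box projection
    return y - D.T @ u                           # primal recovery: x = y - D^T u

# Noisy signal with one jump; the solution is pulled toward piecewise-constant.
y = np.array([0.0, 0.1, -0.1, 5.0, 5.2, 4.9])
x = trend_filter_dual_ga(y, lam=0.5)
```

Because the difference operator annihilates constants, the recovered signal keeps the mean of `y` while its total variation is reduced relative to the input; the ill-conditioning the abstract mentions enters through the spectrum of `D D^T`.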

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-ho22a,
  title     = {On Structured Filtering-Clustering: Global Error Bound and Optimal First-Order Algorithms},
  author    = {Ho, Nhat and Lin, Tianyi and Jordan, Michael},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {896--921},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/ho22a/ho22a.pdf},
  url       = {https://proceedings.mlr.press/v151/ho22a.html},
  abstract  = {Filtering-clustering models, including trend filtering and convex clustering, have become an important source of ideas and modeling tools in machine learning and related fields. The statistical guarantees of optimal solutions in these models have been extensively studied, yet investigation of their computational aspects has remained limited. In particular, practitioners often employ first-order algorithms in real-world applications and observe strong empirical performance despite the ill-conditioned structure of the difference operator matrices, leaving open the problem of understanding the convergence properties of first-order algorithms. This paper settles this open problem and contributes to the broad interplay between statistics and optimization by identifying a global error bound condition, which is satisfied by a large class of dual filtering-clustering problems, and by designing a class of generalized dual gradient ascent algorithms that are optimal first-order algorithms in the deterministic, finite-sum, and online settings. These results are new and help explain why filtering-clustering models can be efficiently solved by first-order algorithms. We further provide a detailed convergence rate analysis of the proposed algorithms in each setting, shedding light on their potential to solve filtering-clustering models efficiently. Finally, experiments on real datasets demonstrate the effectiveness of our algorithms.}
}
Endnote
%0 Conference Paper
%T On Structured Filtering-Clustering: Global Error Bound and Optimal First-Order Algorithms
%A Nhat Ho
%A Tianyi Lin
%A Michael Jordan
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-ho22a
%I PMLR
%P 896--921
%U https://proceedings.mlr.press/v151/ho22a.html
%V 151
%X Filtering-clustering models, including trend filtering and convex clustering, have become an important source of ideas and modeling tools in machine learning and related fields. The statistical guarantees of optimal solutions in these models have been extensively studied, yet investigation of their computational aspects has remained limited. In particular, practitioners often employ first-order algorithms in real-world applications and observe strong empirical performance despite the ill-conditioned structure of the difference operator matrices, leaving open the problem of understanding the convergence properties of first-order algorithms. This paper settles this open problem and contributes to the broad interplay between statistics and optimization by identifying a global error bound condition, which is satisfied by a large class of dual filtering-clustering problems, and by designing a class of generalized dual gradient ascent algorithms that are optimal first-order algorithms in the deterministic, finite-sum, and online settings. These results are new and help explain why filtering-clustering models can be efficiently solved by first-order algorithms. We further provide a detailed convergence rate analysis of the proposed algorithms in each setting, shedding light on their potential to solve filtering-clustering models efficiently. Finally, experiments on real datasets demonstrate the effectiveness of our algorithms.
APA
Ho, N., Lin, T. & Jordan, M. (2022). On Structured Filtering-Clustering: Global Error Bound and Optimal First-Order Algorithms. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:896-921. Available from https://proceedings.mlr.press/v151/ho22a.html.
