Convergent Working Set Algorithm for Lasso with Non-Convex Sparse Regularizers

Alain Rakotomamonjy, Rémi Flamary, Joseph Salmon, Gilles Gasso
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:5196-5211, 2022.

Abstract

Non-convex sparse regularizers are common tools for learning with high-dimensional data. To accelerate convergence of the Lasso problem involving such regularizers, a working set strategy addresses the optimization problem through an iterative algorithm that gradually increments the number of variables to optimize until the solution support is identified. In this paper we propose the first Lasso working set algorithm for non-convex sparse regularizers with convergence guarantees. The algorithm, named FireWorks, is based on a non-convex reformulation of a recent duality-based approach and leverages the geometry of the residuals. We provide theoretical guarantees showing that convergence is preserved even when the inner solver is inexact, provided the error decays sufficiently across iterations. Experimental results demonstrate strong computational gains of our working set strategy over full-problem solvers, with both block-coordinate descent and proximal gradient inner solvers.
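Concretely, the problem class the abstract refers to is penalized least squares, min_w (1/2)||y - Xw||_2^2 + sum_j r_lambda(|w_j|), where r_lambda is a sparsity-inducing penalty that may be non-convex (e.g., log-sum or MCP). As a rough illustration of the kind of working set loop described above, here is a minimal sketch: it repeatedly measures, on the current residual, how strongly each feature violates optimality, grows the working set with the worst violators, and re-solves the subproblem restricted to that set. This is not the paper's FireWorks algorithm; the convex ℓ1 violation score, the ISTA inner solver, and the names working_set_solver / ista / soft_threshold are illustrative assumptions.

import numpy as np

def soft_threshold(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def ista(X, y, lam, w0, n_iter=200):
    # Plain proximal-gradient (ISTA) inner solver for the convex l1 case;
    # a non-convex penalty would swap in its own proximal operator here.
    w = w0.copy()
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

def working_set_solver(X, y, lam, n_grow=10, tol=1e-6, max_outer=50):
    n_features = X.shape[1]
    ws = np.zeros(n_features, dtype=bool)  # mask of the current working set
    w = np.zeros(n_features)
    for _ in range(max_outer):
        residual = y - X @ w
        # Optimality violation measured on the residual; for the convex
        # Lasso a coordinate j is optimal when |X_j^T r| <= lam.
        viol = np.abs(X.T @ residual) - lam
        viol[ws] = -np.inf                 # skip features already in the set
        if viol.max() <= tol:              # no violator left: support found
            break
        # Grow the working set with the worst violators, then re-solve the
        # subproblem restricted to it (inexactly, a fixed number of steps).
        new = np.argsort(viol)[-n_grow:]
        ws[new[viol[new] > tol]] = True
        w[ws] = ista(X[:, ws], y, lam, w[ws])
    return w

# Tiny usage example on random sparse data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
w_true = np.zeros(500); w_true[:5] = 1.0
y = X @ w_true + 0.01 * rng.standard_normal(100)
lam_max = np.max(np.abs(X.T @ y))
w_hat = working_set_solver(X, y, lam=0.1 * lam_max)
print("support size:", int((np.abs(w_hat) > 1e-8).sum()))

The computational gain comes from the outer loop only ever solving small subproblems restricted to the working set. FireWorks additionally handles non-convex penalties via the non-convex duality-based reformulation and tolerates inexact inner solves under a decaying-error condition, neither of which this sketch implements.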

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-rakotomamonjy22a,
  title     = {Convergent Working Set Algorithm for Lasso with Non-Convex Sparse Regularizers},
  author    = {Rakotomamonjy, Alain and Flamary, R\'emi and Salmon, Joseph and Gasso, Gilles},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {5196--5211},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/rakotomamonjy22a/rakotomamonjy22a.pdf},
  url       = {https://proceedings.mlr.press/v151/rakotomamonjy22a.html}
}
Endnote
%0 Conference Paper
%T Convergent Working Set Algorithm for Lasso with Non-Convex Sparse Regularizers
%A Alain Rakotomamonjy
%A Rémi Flamary
%A Joseph Salmon
%A Gilles Gasso
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-rakotomamonjy22a
%I PMLR
%P 5196--5211
%U https://proceedings.mlr.press/v151/rakotomamonjy22a.html
%V 151
APA
Rakotomamonjy, A., Flamary, R., Salmon, J. & Gasso, G. (2022). Convergent Working Set Algorithm for Lasso with Non-Convex Sparse Regularizers. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:5196-5211. Available from https://proceedings.mlr.press/v151/rakotomamonjy22a.html.
