A Nonconvex Proximal Splitting Algorithm under Moreau-Yosida Regularization

Emanuel Laude, Tao Wu, Daniel Cremers
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:491-499, 2018.

Abstract

We tackle highly nonconvex, nonsmooth composite optimization problems whose objectives comprise a Moreau-Yosida regularized term. Classical nonconvex proximal splitting algorithms, such as nonconvex ADMM, suffer from a lack of convergence for such a problem class. To overcome this difficulty, in this work we consider a lifted variant of the Moreau-Yosida regularized model and propose a novel multiblock primal-dual algorithm that intrinsically stabilizes the dual block. We provide a complete convergence analysis of our algorithm and identify respective optimality qualifications under which stationarity of the original model is retrieved at convergence. Numerically, we demonstrate the relevance of Moreau-Yosida regularized models and the efficiency of our algorithm on robust regression as well as joint feature selection and semi-supervised learning.
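For context (this is not part of the published abstract), the Moreau-Yosida regularization referred to above is the standard Moreau envelope. A minimal sketch of the definition and of the lifting idea follows, using a generic function f, smoothing parameter mu > 0, and illustrative placeholders g, A, z that are not necessarily the paper's notation:

% Moreau envelope (Moreau-Yosida regularization) of f with parameter \mu > 0:
f_{\mu}(x) \;=\; \min_{z}\; f(z) + \tfrac{1}{2\mu}\,\lVert x - z \rVert^{2}.

% A lifted variant keeps the auxiliary variable z explicit instead of
% minimizing it out, so a composite model g(x) + f_{\mu}(Ax) is replaced by
% the joint problem over (x, z):
\min_{x,\,z}\; g(x) + f(z) + \tfrac{1}{2\mu}\,\lVert Ax - z \rVert^{2}.

The paper's algorithm and convergence analysis apply to such lifted formulations; the symbols above are only meant to fix the general shape of the model.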

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-laude18a,
  title     = {A Nonconvex Proximal Splitting Algorithm under Moreau-Yosida Regularization},
  author    = {Laude, Emanuel and Wu, Tao and Cremers, Daniel},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {491--499},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/laude18a/laude18a.pdf},
  url       = {https://proceedings.mlr.press/v84/laude18a.html}
}
APA
Laude, E., Wu, T. & Cremers, D. (2018). A Nonconvex Proximal Splitting Algorithm under Moreau-Yosida Regularization. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:491-499. Available from https://proceedings.mlr.press/v84/laude18a.html.