Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization

Feihu Huang, Songcan Chen, Heng Huang
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:2839-2848, 2019.

Abstract

In this paper, we propose a faster stochastic alternating direction method of multipliers (ADMM) for nonconvex optimization, called SPIDER-ADMM, which uses the stochastic path-integrated differential estimator (SPIDER). We prove that SPIDER-ADMM achieves a record-breaking incremental first-order oracle (IFO) complexity for finding an $\epsilon$-approximate solution. As one of the major contributions of this paper, we provide a new theoretical analysis framework for nonconvex stochastic ADMM methods that yields the optimal IFO complexity. Based on this framework, we address the previously open question of the optimal IFO complexity of the existing nonconvex SVRG-ADMM and SAGA-ADMM methods and prove their optimal IFO complexity. Thus, SPIDER-ADMM improves upon the existing stochastic ADMM methods. Moreover, we extend SPIDER-ADMM to the online setting and propose a faster online SPIDER-ADMM, whose IFO complexity our analysis also derives. Finally, experimental results on benchmark datasets validate that the proposed algorithms converge faster than the existing ADMM algorithms for nonconvex optimization.
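
For intuition, below is a minimal sketch of a SPIDER-style gradient estimator, the variance-reduction device the method builds on. The function name spider_gradient, the grad_fn interface, the refresh period q, and the mini-batch size are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Sketch of a SPIDER-style variance-reduced gradient estimator.
    # Every q-th iteration the full gradient is recomputed; in between,
    # the previous estimate is corrected with a gradient difference
    # evaluated on a small mini-batch.
    def spider_gradient(grad_fn, data, x, x_prev, v_prev, t, q=100,
                        batch_size=16, rng=None):
        """Return the SPIDER estimate v_t of the full gradient at x.

        grad_fn(x, batch) must return the average gradient of the loss
        over the samples in `batch`.
        """
        rng = rng or np.random.default_rng()
        if t % q == 0:
            # Periodic full-gradient refresh.
            return grad_fn(x, data)
        # Mini-batch correction of the previous estimate.
        idx = rng.choice(len(data), size=batch_size, replace=False)
        batch = data[idx]
        return v_prev + grad_fn(x, batch) - grad_fn(x_prev, batch)

In a stochastic ADMM scheme of this kind, the estimate returned here would stand in for the exact gradient in the primal update, with the periodic full-gradient refresh keeping the estimator's variance under control.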

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-huang19a,
  title     = {Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization},
  author    = {Huang, Feihu and Chen, Songcan and Huang, Heng},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {2839--2848},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/huang19a/huang19a.pdf},
  url       = {https://proceedings.mlr.press/v97/huang19a.html}
}
Endnote
%0 Conference Paper
%T Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization
%A Feihu Huang
%A Songcan Chen
%A Heng Huang
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-huang19a
%I PMLR
%P 2839--2848
%U https://proceedings.mlr.press/v97/huang19a.html
%V 97
APA
Huang, F., Chen, S. & Huang, H. (2019). Faster Stochastic Alternating Direction Method of Multipliers for Nonconvex Optimization. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:2839-2848. Available from https://proceedings.mlr.press/v97/huang19a.html.
