A Semismooth Newton Method for Fast, Generic Convex Programming

Alnur Ali, Eric Wong, J. Zico Kolter
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:70-79, 2017.

Abstract

We introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by the alternating direction method of multipliers (ADMM) as a set of fixed point equations, and then use a nonsmooth Newton method to find a solution; we apply the basic idea to the Splitting Cone Solver (SCS), a state-of-the-art method for solving generic conic optimization problems. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility, when appropriate, and can be specialized to solve specific convex problems.
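To make the fixed-point view concrete, the following is a minimal, self-contained Python/NumPy sketch of the idea (an illustration only, not the authors' SCS-based implementation). It applies one ADMM sweep for a toy nonnegative least-squares problem as the fixed-point map T, forms the residual F(w) = w - T(w), and drives it to zero with an inexact Newton method, using a finite-difference Jacobian-vector product in place of an explicit generalized Jacobian and GMRES for the Newton system; the problem, variable names, and tolerances are all assumptions made for the sketch.

    # Sketch of the Newton-ADMM idea on: minimize 0.5*||Ax - b||^2 s.t. x >= 0.
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    rng = np.random.default_rng(0)
    m, n, rho = 30, 10, 1.0
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    AtA_rho = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b

    def admm_iteration(w):
        # One ADMM sweep on w = (z, u); the x-variable is eliminated via its update.
        z, u = w[:n], w[n:]
        x = np.linalg.solve(AtA_rho, Atb + rho * (z - u))
        z_new = np.maximum(x + u, 0.0)   # projection onto the nonnegative orthant
        u_new = u + x - z_new
        return np.concatenate([z_new, u_new])

    def residual(w):
        return w - admm_iteration(w)     # F(w) = 0 exactly at an ADMM fixed point

    w = np.zeros(2 * n)
    for _ in range(25):
        F = residual(w)
        nF = np.linalg.norm(F)
        if nF < 1e-10:
            break
        # Matrix-free directional derivative of F; F is piecewise affine here, so
        # this matches an element of the Clarke generalized Jacobian away from kinks.
        eps = 1e-7
        J = LinearOperator((2 * n, 2 * n),
                           matvec=lambda v, w=w, F=F: (residual(w + eps * v) - F) / eps)
        d, _ = gmres(J, -F)              # inexact (semismooth) Newton direction
        t = 1.0                          # backtracking line search on ||F||
        while t > 1e-12 and np.linalg.norm(residual(w + t * d)) > (1 - 1e-4 * t) * nF:
            t *= 0.5
        w = w + t * d

    print("fixed-point residual:", np.linalg.norm(residual(w)))

On this toy instance the Newton iteration reaches a tiny fixed-point residual in a handful of outer steps, whereas plain ADMM would take many more sweeps; the paper's contribution is making this scheme work, with convergence guarantees, for the general conic programs handled by SCS.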

Cite this Paper

BibTeX
@InProceedings{pmlr-v70-ali17a,
  title     = {A Semismooth {N}ewton Method for Fast, Generic Convex Programming},
  author    = {Alnur Ali and Eric Wong and J. Zico Kolter},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {70--79},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/ali17a/ali17a.pdf},
  url       = {https://proceedings.mlr.press/v70/ali17a.html},
  abstract  = {We introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by the alternating direction method of multipliers (ADMM) as a set of fixed point equations, and then use a nonsmooth Newton method to find a solution; we apply the basic idea to the Splitting Cone Solver (SCS), a state-of-the-art method for solving generic conic optimization problems. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility, when appropriate, and can be specialized to solve specific convex problems.}
}
Endnote
%0 Conference Paper
%T A Semismooth Newton Method for Fast, Generic Convex Programming
%A Alnur Ali
%A Eric Wong
%A J. Zico Kolter
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-ali17a
%I PMLR
%P 70--79
%U https://proceedings.mlr.press/v70/ali17a.html
%V 70
%X We introduce Newton-ADMM, a method for fast conic optimization. The basic idea is to view the residuals of consecutive iterates generated by the alternating direction method of multipliers (ADMM) as a set of fixed point equations, and then use a nonsmooth Newton method to find a solution; we apply the basic idea to the Splitting Cone Solver (SCS), a state-of-the-art method for solving generic conic optimization problems. We demonstrate theoretically, by extending the theory of semismooth operators, that Newton-ADMM converges rapidly (i.e., quadratically) to a solution; empirically, Newton-ADMM is significantly faster than SCS on a number of problems. The method also has essentially no tuning parameters, generates certificates of primal or dual infeasibility, when appropriate, and can be specialized to solve specific convex problems.
APA
Ali, A., Wong, E., & Kolter, J. Z. (2017). A Semismooth Newton Method for Fast, Generic Convex Programming. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:70-79. Available from https://proceedings.mlr.press/v70/ali17a.html.
