Optimal Minimization of the Sum of Three Convex Functions with a Linear Operator

Seyoon Ko, Joong-Ho Won
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1185-1194, 2019.

Abstract

We propose a class of optimal-rate primal-dual algorithms for minimization of the sum of three convex functions with a linear operator. We first establish the optimal convergence rates for solving the saddle-point reformulation of the problem only using first-order information under deterministic and stochastic settings, respectively. We then proceed to show that the proposed algorithm class achieves these rates. The studied algorithm class does not require matrix inversion and is simple to implement. To our knowledge, this is the first work to establish and attain the optimal rates for this class of problems with minimal assumptions. Numerical experiments show that our method outperforms state-of-the-art methods.
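The abstract refers to the saddle-point reformulation of the three-function problem. In the standard setting (a sketch; the paper's exact assumptions and notation may differ), the primal problem and its primal-dual reformulation via the convex conjugate read:

```latex
% Primal problem: f smooth convex, g and h convex with simple
% proximal operators, K a bounded linear operator.
\min_{x} \; f(x) + g(x) + h(Kx)

% Saddle-point reformulation, dualizing h through its convex
% conjugate h^*(y) = \sup_{z} \,\langle z, y\rangle - h(z):
\min_{x} \max_{y} \; f(x) + g(x) + \langle Kx, y\rangle - h^{*}(y)
```

First-order primal-dual methods of this type alternate proximal steps on $g$ and $h^{*}$ with gradient steps on $f$, avoiding any inversion of $K$ or $K^{\top}K$, which is consistent with the abstract's claim that the algorithm class is matrix-inversion free.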

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-ko19a,
  title     = {Optimal Minimization of the Sum of Three Convex Functions with a Linear Operator},
  author    = {Ko, Seyoon and Won, Joong-Ho},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1185--1194},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/ko19a/ko19a.pdf},
  url       = {https://proceedings.mlr.press/v89/ko19a.html},
  abstract  = {We propose a class of optimal-rate primal-dual algorithms for minimization of the sum of three convex functions with a linear operator. We first establish the optimal convergence rates for solving the saddle-point reformulation of the problem only using first-order information under deterministic and stochastic settings, respectively. We then proceed to show that the proposed algorithm class achieves these rates. The studied algorithm class does not require matrix inversion and is simple to implement. To our knowledge, this is the first work to establish and attain the optimal rates for this class of problems with minimal assumptions. Numerical experiments show that our method outperforms state-of-the-art methods.}
}
Endnote
%0 Conference Paper
%T Optimal Minimization of the Sum of Three Convex Functions with a Linear Operator
%A Seyoon Ko
%A Joong-Ho Won
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-ko19a
%I PMLR
%P 1185--1194
%U https://proceedings.mlr.press/v89/ko19a.html
%V 89
%X We propose a class of optimal-rate primal-dual algorithms for minimization of the sum of three convex functions with a linear operator. We first establish the optimal convergence rates for solving the saddle-point reformulation of the problem only using first-order information under deterministic and stochastic settings, respectively. We then proceed to show that the proposed algorithm class achieves these rates. The studied algorithm class does not require matrix inversion and is simple to implement. To our knowledge, this is the first work to establish and attain the optimal rates for this class of problems with minimal assumptions. Numerical experiments show that our method outperforms state-of-the-art methods.
APA
Ko, S. & Won, J.-H. (2019). Optimal Minimization of the Sum of Three Convex Functions with a Linear Operator. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1185-1194. Available from https://proceedings.mlr.press/v89/ko19a.html.