Random extrapolation for primal-dual coordinate descent

Ahmet Alacaoglu, Olivier Fercoq, Volkan Cevher
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:191-201, 2020.

Abstract

We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to the sparsity of the data matrix and to favorable structures of the objective function. Our method updates only a subset of primal and dual variables with sparse data, and it uses large step sizes with dense data, retaining the benefits of the specific methods designed for each case. In addition to adapting to sparsity, our method attains fast convergence guarantees in favorable cases without any modifications. In particular, we prove linear convergence under metric subregularity, which applies to strongly convex-strongly concave problems and piecewise linear quadratic functions. We show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and objective values in the general convex-concave case. Numerical evidence demonstrates the state-of-the-art empirical performance of our method in sparse and dense settings, matching and improving upon existing methods.
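The method described above builds on the primal-dual hybrid gradient (PDHG, Chambolle-Pock) template, in which the dual update uses an extrapolated primal point; the paper's contribution is to randomize the coordinate updates and the extrapolation. As background only, here is a minimal sketch of the deterministic, full-vector PDHG baseline on a small Lasso problem. The problem sizes, data, and step-size choices are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Illustrative Lasso instance: min_x lam*||x||_1 + 0.5*||A x - b||^2,
# written as the saddle problem min_x max_y lam*||x||_1 + <A x, y> - 0.5*||y||^2 - <b, y>.
rng = np.random.default_rng(0)
m, n = 20, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]           # sparse ground truth (assumption)
b = A @ x_true
lam = 0.01

L = np.linalg.norm(A, 2)                # operator norm of A
tau = sigma = 0.9 / L                   # step sizes with tau*sigma*L^2 < 1
theta = 1.0                             # extrapolation parameter

x = np.zeros(n)
y = np.zeros(m)
x_bar = x.copy()
for _ in range(2000):
    # Dual step: prox of sigma * (0.5*||y||^2 + <b, y>) at the extrapolated point.
    y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
    # Primal step: prox of tau * lam * ||.||_1.
    x_new = soft_threshold(x - tau * (A.T @ y), tau * lam)
    # Extrapolation: the paper randomizes this step and the coordinate updates.
    x_bar = x_new + theta * (x_new - x)
    x = x_new

obj = lam * np.abs(x).sum() + 0.5 * np.linalg.norm(A @ x - b) ** 2
```

A coordinate-descent variant would update only a random block of coordinates per iteration with correspondingly larger step sizes; the paper shows how to combine this with extrapolation while keeping the convergence guarantees stated above.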

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-alacaoglu20a,
  title     = {Random extrapolation for primal-dual coordinate descent},
  author    = {Alacaoglu, Ahmet and Fercoq, Olivier and Cevher, Volkan},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {191--201},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/alacaoglu20a/alacaoglu20a.pdf},
  url       = {https://proceedings.mlr.press/v119/alacaoglu20a.html},
  abstract  = {We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to sparsity of the data matrix and the favorable structures of the objective function. Our method updates only a subset of primal and dual variables with sparse data, and it uses large step sizes with dense data, retaining the benefits of the specific methods designed for each case. In addition to adapting to sparsity, our method attains fast convergence guarantees in favorable cases \emph{without any modifications}. In particular, we prove linear convergence under metric subregularity, which applies to strongly convex-strongly concave problems and piecewise linear quadratic functions. We show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and objective values, in the general convex-concave case. Numerical evidence demonstrates the state-of-the-art empirical performance of our method in sparse and dense settings, matching and improving the existing methods.}
}
Endnote
%0 Conference Paper
%T Random extrapolation for primal-dual coordinate descent
%A Ahmet Alacaoglu
%A Olivier Fercoq
%A Volkan Cevher
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-alacaoglu20a
%I PMLR
%P 191--201
%U https://proceedings.mlr.press/v119/alacaoglu20a.html
%V 119
%X We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to sparsity of the data matrix and the favorable structures of the objective function. Our method updates only a subset of primal and dual variables with sparse data, and it uses large step sizes with dense data, retaining the benefits of the specific methods designed for each case. In addition to adapting to sparsity, our method attains fast convergence guarantees in favorable cases \emph{without any modifications}. In particular, we prove linear convergence under metric subregularity, which applies to strongly convex-strongly concave problems and piecewise linear quadratic functions. We show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and objective values, in the general convex-concave case. Numerical evidence demonstrates the state-of-the-art empirical performance of our method in sparse and dense settings, matching and improving the existing methods.
APA
Alacaoglu, A., Fercoq, O. &amp; Cevher, V. (2020). Random extrapolation for primal-dual coordinate descent. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:191-201. Available from https://proceedings.mlr.press/v119/alacaoglu20a.html.