Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection

Julie Nutini, Mark Schmidt, Issam Laradji, Michael Friedlander, Hoyt Koepke
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1632-1641, 2015.

Abstract

There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22(2), 2012], who showed that a random-coordinate selection rule achieves the same convergence rate as the Gauss-Southwell selection rule. This result suggests that we should never use the Gauss-Southwell rule, as it is typically much more expensive than random selection. However, the empirical behaviours of these algorithms contradict this theoretical result: in applications where the computational costs of the selection rules are comparable, the Gauss-Southwell selection rule tends to perform substantially better than random coordinate selection. We give a simple analysis of the Gauss-Southwell rule showing that, except in extreme cases, its convergence rate is faster than choosing random coordinates. Further, in this work we (i) show that exact coordinate optimization improves the convergence rate for certain sparse problems, (ii) propose a Gauss-Southwell-Lipschitz rule that gives an even faster convergence rate given knowledge of the Lipschitz constants of the partial derivatives, (iii) analyze the effect of approximate Gauss-Southwell rules, and (iv) analyze proximal-gradient variants of the Gauss-Southwell rule.
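
To make the selection rules concrete, the following is a minimal sketch (not the authors' code) of coordinate descent on a smooth quadratic, comparing uniform-random selection, the Gauss-Southwell rule (pick the coordinate whose partial derivative is largest in magnitude), and the Gauss-Southwell-Lipschitz rule (weight each partial derivative by 1/sqrt(L_i)). The test problem, its size, and the iteration count are illustrative assumptions; for this objective the coordinate-wise Lipschitz constants L_i are the diagonal entries of the Hessian.

import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)          # positive-definite Hessian
b = rng.standard_normal(n)
L = np.diag(A)                   # coordinate-wise Lipschitz constants L_i = A[i, i]

def f(x):
    # f(x) = 0.5 * x'Ax - b'x, with gradient A x - b
    return 0.5 * x @ A @ x - b @ x

def coordinate_descent(rule, iters=2000):
    x = np.zeros(n)
    for _ in range(iters):
        g = A @ x - b                              # full gradient (can be maintained cheaply in practice)
        if rule == "random":                       # uniform-random coordinate
            i = rng.integers(n)
        elif rule == "gauss-southwell":            # largest |partial derivative|
            i = np.argmax(np.abs(g))
        else:                                      # Gauss-Southwell-Lipschitz: |grad_i| / sqrt(L_i)
            i = np.argmax(np.abs(g) / np.sqrt(L))
        x[i] -= g[i] / L[i]                        # coordinate update with step size 1/L_i
    return f(x)

for rule in ("random", "gauss-southwell", "gauss-southwell-lipschitz"):
    print(f"{rule:28s} f(x) = {coordinate_descent(rule):.6f}")

On instances like this, the two Gauss-Southwell variants typically decrease the objective faster per iteration than random selection, which is the behaviour the paper's analysis explains.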

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-nutini15,
  title     = {Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection},
  author    = {Nutini, Julie and Schmidt, Mark and Laradji, Issam and Friedlander, Michael and Koepke, Hoyt},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1632--1641},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/nutini15.pdf},
  url       = {https://proceedings.mlr.press/v37/nutini15.html},
  abstract  = {There has been significant recent work on the theory and application of randomized coordinate descent algorithms, beginning with the work of Nesterov [SIAM J. Optim., 22(2), 2012], who showed that a random-coordinate selection rule achieves the same convergence rate as the Gauss-Southwell selection rule. This result suggests that we should never use the Gauss-Southwell rule, as it is typically much more expensive than random selection. However, the empirical behaviours of these algorithms contradict this theoretical result: in applications where the computational costs of the selection rules are comparable, the Gauss-Southwell selection rule tends to perform substantially better than random coordinate selection. We give a simple analysis of the Gauss-Southwell rule showing that, except in extreme cases, its convergence rate is faster than choosing random coordinates. Further, in this work we (i) show that exact coordinate optimization improves the convergence rate for certain sparse problems, (ii) propose a Gauss-Southwell-Lipschitz rule that gives an even faster convergence rate given knowledge of the Lipschitz constants of the partial derivatives, (iii) analyze the effect of approximate Gauss-Southwell rules, and (iv) analyze proximal-gradient variants of the Gauss-Southwell rule.}
}
APA
Nutini, J., Schmidt, M., Laradji, I., Friedlander, M. & Koepke, H. (2015). Coordinate Descent Converges Faster with the Gauss-Southwell Rule Than Random Selection. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1632-1641. Available from https://proceedings.mlr.press/v37/nutini15.html.
