An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

Xingguo Li, Tuo Zhao, Raman Arora, Han Liu, Mingyi Hong
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:491-499, 2016.

Abstract

The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods attain an iteration complexity of O(p⋅\log(1/ε)), where ε is a pre-specified accuracy of the objective value and p is the number of blocks. However, such iteration complexity explicitly depends on p, and is therefore at least p times worse than that of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity of the CBCD-type methods matches that of the GD methods in terms of dependency on p (up to a \log^2 p factor). Thus our complexity bounds are sharper than the existing bounds by at least a factor of p/\log^2 p. We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a \log^2 p factor) if the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones.
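
To make the setting of the abstract concrete, the sketch below (not the authors' code) runs exact cyclic block coordinate descent on a strongly convex quadratic f(x) = 0.5 xᵀAx − bᵀx partitioned into p blocks: each cycle sweeps the blocks in a fixed order and minimizes exactly over one block at a time, using the freshest iterate for the remaining blocks. The function name cbcd_quadratic, the block partition, and the random test problem are illustrative assumptions.

```python
import numpy as np

def cbcd_quadratic(A, b, blocks, n_cycles=500):
    """Exact cyclic block coordinate descent (CBCD) for
        min_x  0.5 * x^T A x - b^T x,
    where A is symmetric positive definite (strongly convex) and
    `blocks` is a list of index arrays partitioning {0, ..., d-1} into p blocks."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_cycles):
        # One cycle = one sweep over all p blocks in a fixed order.
        for idx in blocks:
            g_i = A[idx] @ x - b[idx]                             # block of the gradient at the current x
            x[idx] -= np.linalg.solve(A[np.ix_(idx, idx)], g_i)   # exact minimization over this block
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, p = 20, 4
    M = rng.standard_normal((d, d))
    A = M @ M.T + np.eye(d)                       # positive definite Hessian
    b = rng.standard_normal(d)
    blocks = np.array_split(np.arange(d), p)      # p contiguous blocks
    x_hat = cbcd_quadratic(A, b, blocks)
    x_star = np.linalg.solve(A, b)
    print("distance to the exact minimizer:", np.linalg.norm(x_hat - x_star))
```

Because each block update solves the block subproblem exactly with the most recent values of the other blocks, this is the Gauss-Seidel-style update whose iteration complexity the paper analyzes.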

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-li16c,
  title     = {An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization},
  author    = {Li, Xingguo and Zhao, Tuo and Arora, Raman and Liu, Han and Hong, Mingyi},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {491--499},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/li16c.pdf},
  url       = {https://proceedings.mlr.press/v51/li16c.html},
  abstract  = {The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods attain an iteration complexity of O(p⋅\log(1/ε)), where ε is a pre-specified accuracy of the objective value and p is the number of blocks. However, such iteration complexity explicitly depends on p, and is therefore at least p times worse than that of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity of the CBCD-type methods matches that of the GD methods in terms of dependency on p (up to a \log^2 p factor). Thus our complexity bounds are sharper than the existing bounds by at least a factor of p/\log^2 p. We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a \log^2 p factor) if the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones.}
}
APA
Li, X., Zhao, T., Arora, R., Liu, H. & Hong, M. (2016). An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:491-499. Available from https://proceedings.mlr.press/v51/li16c.html.
