First-order Methods for Geodesically Convex Optimization

Hongyi Zhang, Suvrit Sra
29th Annual Conference on Learning Theory, PMLR 49:1617-1638, 2016.

Abstract

Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds for the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially sectional curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.
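
The first-order methods analyzed in the paper take geodesic steps of the form x_{t+1} = Exp_{x_t}(-eta_t g_t), where g_t is a Riemannian (sub)gradient and Exp is the exponential map. As a concrete illustration (not the authors' code), the sketch below runs Riemannian gradient descent on the hyperboloid model of hyperbolic space, a standard Hadamard manifold, to minimize the g-convex function f(x) = 0.5 d(x, p)^2, whose Riemannian gradient is -Log_x(p). The objective, step size, and helper names (lorentz, exp_map, log_map) are illustrative choices, not from the paper.

# Minimal sketch of Riemannian gradient descent on the hyperboloid
# {x : <x, x>_L = -1, x_0 > 0}, a model Hadamard manifold.
import numpy as np

def lorentz(u, v):
    """Lorentzian inner product <u, v>_L = -u_0 v_0 + sum_i u_i v_i."""
    return -u[0] * v[0] + u[1:] @ v[1:]

def exp_map(x, v):
    """Exponential map Exp_x(v): follow the geodesic from x in direction v."""
    norm_v = np.sqrt(max(lorentz(v, v), 0.0))
    if norm_v < 1e-12:
        return x
    return np.cosh(norm_v) * x + np.sinh(norm_v) * (v / norm_v)

def log_map(x, p):
    """Inverse of Exp: the tangent vector at x pointing toward p."""
    alpha = -lorentz(x, p)                  # alpha = cosh(d(x, p)) >= 1
    d = np.arccosh(np.clip(alpha, 1.0, None))
    u = p - alpha * x                       # component of p tangent at x
    norm_u = np.sqrt(max(lorentz(u, u), 0.0))
    return np.zeros_like(x) if norm_u < 1e-12 else (d / norm_u) * u

# g-convex objective: f(x) = 0.5 * d(x, p)^2, with grad f(x) = -Log_x(p).
p = exp_map(np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.8, -0.3]))
x = np.array([1.0, 0.0, 0.0])               # start at the hyperboloid "origin"
eta = 0.5                                   # illustrative constant step size
for t in range(50):
    grad = -log_map(x, p)                   # Riemannian gradient at x
    x = exp_map(x, -eta * grad)             # geodesic step x <- Exp_x(-eta*grad)

print("distance to minimizer:", np.arccosh(max(-lorentz(x, p), 1.0)))

Each iteration moves along a geodesic rather than a straight line; this is the step whose global complexity the paper bounds, with constants that depend on the manifold's sectional curvature.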

Cite this Paper


BibTeX
@InProceedings{pmlr-v49-zhang16b,
  title     = {First-order Methods for Geodesically Convex Optimization},
  author    = {Zhang, Hongyi and Sra, Suvrit},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {1617--1638},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/zhang16b.pdf},
  url       = {https://proceedings.mlr.press/v49/zhang16b.html},
  abstract  = {Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds for the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially \emph{sectional} curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.}
}
Endnote
%0 Conference Paper
%T First-order Methods for Geodesically Convex Optimization
%A Hongyi Zhang
%A Suvrit Sra
%B 29th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2016
%E Vitaly Feldman
%E Alexander Rakhlin
%E Ohad Shamir
%F pmlr-v49-zhang16b
%I PMLR
%P 1617--1638
%U https://proceedings.mlr.press/v49/zhang16b.html
%V 49
%X Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds for the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially sectional curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.
RIS
TY - CPAPER
TI - First-order Methods for Geodesically Convex Optimization
AU - Hongyi Zhang
AU - Suvrit Sra
BT - 29th Annual Conference on Learning Theory
DA - 2016/06/06
ED - Vitaly Feldman
ED - Alexander Rakhlin
ED - Ohad Shamir
ID - pmlr-v49-zhang16b
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 49
SP - 1617
EP - 1638
L1 - http://proceedings.mlr.press/v49/zhang16b.pdf
UR - https://proceedings.mlr.press/v49/zhang16b.html
AB - Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds for the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially sectional curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.
ER -
APA
Zhang, H. & Sra, S. (2016). First-order Methods for Geodesically Convex Optimization. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:1617-1638. Available from https://proceedings.mlr.press/v49/zhang16b.html.