Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties

David Martínez-Rubio, Sebastian Pokutta
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:359-393, 2023.

Abstract

We propose a globally-accelerated, first-order method for the optimization of smooth and (strongly or not) geodesically-convex functions in a wide class of Hadamard manifolds. We achieve the same convergence rates as Nesterov’s accelerated gradient descent, up to a multiplicative geometric penalty and log factors. Crucially, we can enforce our method to stay within a compact set that we define. Prior fully accelerated works resort to assuming that the iterates of their algorithms stay in some pre-specified compact set, except for two previous methods of limited applicability. For our manifolds, this solves the open question in Kim and Yang (2022) about obtaining global general acceleration without assuming the iterates stay in the feasible set. In our solution, we design an accelerated Riemannian inexact proximal point algorithm, a result that was unknown even with exact access to the proximal operator and is of independent interest. For smooth functions, we show that we can implement the prox step inexactly with first-order methods in Riemannian balls whose diameter suffices for global accelerated optimization.
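To make the inexact proximal point idea concrete, below is a minimal illustrative sketch in Python. It is not the paper's accelerated algorithm: it runs a plain Riemannian gradient-descent inner loop on each proximal subproblem h(y) = f(y) + d(y, x_k)^2 / (2*lam), whereas the paper couples the outer loop with an acceleration scheme and keeps each subproblem inside a Riemannian ball of controlled diameter. The callables exp_map and log_map, the step size, and the iteration counts are assumptions made purely for illustration.

import numpy as np

def riemannian_inexact_ppa(grad_f, exp_map, log_map, x0,
                           lam=1.0, n_outer=50, n_inner=100, inner_lr=0.1):
    # Outer loop: inexact proximal point iterations x_{k+1} ~ prox_{lam*f}(x_k).
    # Inner loop: approximately minimize the proximal subproblem
    #   h(y) = f(y) + d(y, x_k)^2 / (2 * lam)
    # by Riemannian gradient descent, using that on a Hadamard manifold the
    # Riemannian gradient of y -> d(y, x)^2 / 2 at y is -log_map(y, x).
    x = x0
    for _ in range(n_outer):
        y = x
        for _ in range(n_inner):
            g = grad_f(y) - log_map(y, x) / lam  # gradient of h at y
            y = exp_map(y, -inner_lr * g)        # step along a geodesic from y
        x = y
    return x

# Euclidean sanity check (the flat case): exp/log maps reduce to vector
# addition and subtraction, and the loop minimizes a simple quadratic.
b = np.array([1.0, -2.0, 3.0])
grad_f = lambda y: 2.0 * (y - b)          # gradient of f(y) = ||y - b||^2
exp_map = lambda y, v: y + v
log_map = lambda y, x: x - y
x_star = riemannian_inexact_ppa(grad_f, exp_map, log_map, np.zeros(3))
print(x_star)  # approaches b

The point this sketch elides, and which the paper addresses, is that solving each subproblem over a ball of bounded diameter is what keeps the geometric penalty bounded; the unconstrained inner loop above is only a stand-in for that step.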

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-martinez-rubio23a,
  title     = {Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties},
  author    = {Mart{\'i}nez-Rubio, David and Pokutta, Sebastian},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {359--393},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/martinez-rubio23a/martinez-rubio23a.pdf},
  url       = {https://proceedings.mlr.press/v195/martinez-rubio23a.html},
  abstract  = {We propose a globally-accelerated, first-order method for the optimization of smooth and (strongly or not) geodesically-convex functions in a wide class of Hadamard manifolds. We achieve the same convergence rates as Nesterov’s accelerated gradient descent, up to a multiplicative geometric penalty and log factors. Crucially, we can enforce our method to stay within a compact set that we define. Prior fully accelerated works \emph{resort to assuming} that the iterates of their algorithms stay in some pre-specified compact set, except for two previous methods of limited applicability. For our manifolds, this solves the open question in Kim and Yang (2022) about obtaining global general acceleration without assuming the iterates stay in the feasible set. In our solution, we design an accelerated Riemannian inexact proximal point algorithm, a result that was unknown even with exact access to the proximal operator and is of independent interest. For smooth functions, we show that we can implement the prox step inexactly with first-order methods in Riemannian balls whose diameter suffices for global accelerated optimization.}
}
Endnote
%0 Conference Paper
%T Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties
%A David Martínez-Rubio
%A Sebastian Pokutta
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-martinez-rubio23a
%I PMLR
%P 359--393
%U https://proceedings.mlr.press/v195/martinez-rubio23a.html
%V 195
%X We propose a globally-accelerated, first-order method for the optimization of smooth and (strongly or not) geodesically-convex functions in a wide class of Hadamard manifolds. We achieve the same convergence rates as Nesterov’s accelerated gradient descent, up to a multiplicative geometric penalty and log factors. Crucially, we can enforce our method to stay within a compact set that we define. Prior fully accelerated works resort to assuming that the iterates of their algorithms stay in some pre-specified compact set, except for two previous methods of limited applicability. For our manifolds, this solves the open question in Kim and Yang (2022) about obtaining global general acceleration without assuming the iterates stay in the feasible set. In our solution, we design an accelerated Riemannian inexact proximal point algorithm, a result that was unknown even with exact access to the proximal operator and is of independent interest. For smooth functions, we show that we can implement the prox step inexactly with first-order methods in Riemannian balls whose diameter suffices for global accelerated optimization.
APA
Martínez-Rubio, D. & Pokutta, S. (2023). Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:359-393. Available from https://proceedings.mlr.press/v195/martinez-rubio23a.html.