Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties

David Martínez-Rubio, Christophe Roux, Christopher Criscitiello, Sebastian Pokutta
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:280-288, 2025.

Abstract

In this work, we study optimization problems of the form $\min_x \max_y f(x, y)$, where $f(x, y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in $x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y \geq 0$. We design accelerated methods when $f$ is $(L_x, L_y, L_{xy})$-smooth and $\mathcal{M}$, $\mathcal{N}$ are Hadamard. To that aim we introduce new g-convex optimization results, of independent interest: we show global linear convergence for metric-projected Riemannian gradient descent and improve existing accelerated methods by reducing geometric constants. Additionally, we complete the analysis of two previous works applying to the Riemannian min-max case by removing an assumption about iterates staying in a pre-specified compact set.

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-martinez-rubio25a,
  title     = {Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties},
  author    = {Mart{\'i}nez-Rubio, David and Roux, Christophe and Criscitiello, Christopher and Pokutta, Sebastian},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {280--288},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/martinez-rubio25a/martinez-rubio25a.pdf},
  url       = {https://proceedings.mlr.press/v258/martinez-rubio25a.html},
  abstract  = {In this work, we study optimization problems of the form $\min_x \max_y f(x, y)$, where $f(x, y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in $x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y \geq 0$. We design accelerated methods when $f$ is $(L_x, L_y, L_{xy})$-smooth and $\mathcal{M}$, $\mathcal{N}$ are Hadamard. To that aim we introduce new g-convex optimization results, of independent interest: we show global linear convergence for metric-projected Riemannian gradient descent and improve existing accelerated methods by reducing geometric constants. Additionally, we complete the analysis of two previous works applying to the Riemannian min-max case by removing an assumption about iterates staying in a pre-specified compact set.}
}
Endnote
%0 Conference Paper
%T Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties
%A David Martínez-Rubio
%A Christophe Roux
%A Christopher Criscitiello
%A Sebastian Pokutta
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-martinez-rubio25a
%I PMLR
%P 280--288
%U https://proceedings.mlr.press/v258/martinez-rubio25a.html
%V 258
%X In this work, we study optimization problems of the form $\min_x \max_y f(x, y)$, where $f(x, y)$ is defined on a product Riemannian manifold $\mathcal{M} \times \mathcal{N}$ and is $\mu_x$-strongly geodesically convex (g-convex) in $x$ and $\mu_y$-strongly g-concave in $y$, for $\mu_x, \mu_y \geq 0$. We design accelerated methods when $f$ is $(L_x, L_y, L_{xy})$-smooth and $\mathcal{M}$, $\mathcal{N}$ are Hadamard. To that aim we introduce new g-convex optimization results, of independent interest: we show global linear convergence for metric-projected Riemannian gradient descent and improve existing accelerated methods by reducing geometric constants. Additionally, we complete the analysis of two previous works applying to the Riemannian min-max case by removing an assumption about iterates staying in a pre-specified compact set.
APA
Martínez-Rubio, D., Roux, C., Criscitiello, C., & Pokutta, S. (2025). Accelerated Methods for Riemannian Min-Max Optimization Ensuring Bounded Geometric Penalties. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:280-288. Available from https://proceedings.mlr.press/v258/martinez-rubio25a.html.