Contractivity and linear convergence in bilinear saddle-point problems: An operator-theoretic approach

Colin Dirren, Mattia Bianchi, Panagiotis D. Grontas, John Lygeros, Florian Dorfler
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1954-1962, 2025.

Abstract

We study the convex-concave bilinear saddle-point problem $\min_x \max_y f(x) + y^\top Ax - g(y)$, where both, only one, or none of the functions $f$ and $g$ are strongly convex, and suitable rank conditions on the matrix $A$ hold. The solution of this problem is at the core of many machine learning tasks. By employing tools from monotone operator theory, we systematically prove the contractivity (in turn, the linear convergence) of several first-order primal-dual algorithms, including the Chambolle–Pock method. Our approach results in concise proofs, and it yields new convergence guarantees and tighter bounds compared to known results.
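To make the setting concrete, the following is a minimal toy sketch of the Chambolle–Pock primal-dual iteration applied to the bilinear saddle-point problem $\min_x \max_y f(x) + y^\top Ax - g(y)$. It is not the paper's implementation or parameter choices: the quadratic choices $f(x) = \tfrac{1}{2}\|x\|^2$ and $g(y) = \tfrac{1}{2}\|y\|^2$ (so both functions are strongly convex and the saddle point is the origin), the step sizes, and the function names are illustrative assumptions only.

```python
import numpy as np

def chambolle_pock(A, prox_f, prox_g, tau, sigma, x0, y0, iters=500):
    """Chambolle-Pock iteration for min_x max_y f(x) + y^T A x - g(y).

    prox_f / prox_g are the proximal operators of f and g. A standard
    sufficient step-size condition is tau * sigma * ||A||^2 < 1.
    (Toy sketch; not the paper's exact algorithm parameters.)
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        # primal proximal (gradient-like) step on f + y^T A x
        x_new = prox_f(x - tau * A.T @ y, tau)
        # dual proximal step with primal extrapolation 2*x_new - x
        y = prox_g(y + sigma * A @ (2 * x_new - x), sigma)
        x = x_new
    return x, y

# Toy strongly convex case: f(x) = ||x||^2 / 2, g(y) = ||y||^2 / 2,
# whose saddle point is (0, 0).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
L = np.linalg.norm(A, 2)                  # spectral norm ||A||
tau = sigma = 0.9 / L                     # so tau * sigma * ||A||^2 < 1
prox_f = lambda v, t: v / (1 + t)         # prox of (1/2)||.||^2
prox_g = lambda v, s: v / (1 + s)

x, y = chambolle_pock(A, prox_f, prox_g, tau, sigma,
                      np.ones(3), np.ones(3))
print(np.linalg.norm(x), np.linalg.norm(y))  # both should be near zero
```

With strong convexity on both sides, the iteration is a contraction, so the iterates converge linearly to the saddle point; the paper's contribution is to establish such contractivity systematically, including under weaker assumptions (strong convexity of only one function, or rank conditions on $A$).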

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-dirren25a,
  title     = {Contractivity and linear convergence in bilinear saddle-point problems: An operator-theoretic approach},
  author    = {Dirren, Colin and Bianchi, Mattia and Grontas, Panagiotis D. and Lygeros, John and Dorfler, Florian},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1954--1962},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/dirren25a/dirren25a.pdf},
  url       = {https://proceedings.mlr.press/v258/dirren25a.html},
  abstract  = {We study the convex-concave bilinear saddle-point problem $\min_x \max_y f(x) + y^\top Ax - g(y)$, where both, only one, or none of the functions $f$ and $g$ are strongly convex, and suitable rank conditions on the matrix $A$ hold. The solution of this problem is at the core of many machine learning tasks. By employing tools from monotone operator theory, we systematically prove the contractivity (in turn, the linear convergence) of several first-order primal-dual algorithms, including the Chambolle–Pock method. Our approach results in concise proofs, and it yields new convergence guarantees and tighter bounds compared to known results.}
}
Endnote
%0 Conference Paper
%T Contractivity and linear convergence in bilinear saddle-point problems: An operator-theoretic approach
%A Colin Dirren
%A Mattia Bianchi
%A Panagiotis D. Grontas
%A John Lygeros
%A Florian Dorfler
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-dirren25a
%I PMLR
%P 1954--1962
%U https://proceedings.mlr.press/v258/dirren25a.html
%V 258
%X We study the convex-concave bilinear saddle-point problem $\min_x \max_y f(x) + y^\top Ax - g(y)$, where both, only one, or none of the functions $f$ and $g$ are strongly convex, and suitable rank conditions on the matrix $A$ hold. The solution of this problem is at the core of many machine learning tasks. By employing tools from monotone operator theory, we systematically prove the contractivity (in turn, the linear convergence) of several first-order primal-dual algorithms, including the Chambolle–Pock method. Our approach results in concise proofs, and it yields new convergence guarantees and tighter bounds compared to known results.
APA
Dirren, C., Bianchi, M., Grontas, P.D., Lygeros, J. &amp; Dorfler, F. (2025). Contractivity and linear convergence in bilinear saddle-point problems: An operator-theoretic approach. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1954-1962. Available from https://proceedings.mlr.press/v258/dirren25a.html.