A Novel Convex Gaussian Min Max Theorem for Repeated Features

David Bosch, Ashkan Panahi
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3673-3681, 2025.

Abstract

The Convex Gaussian Min-Max Theorem (CGMT) allows for the study of min-max optimization problems over bilinear Gaussian forms by instead considering an alternative optimization problem whose statistical properties are tied to that of the primary optimization. We prove a generalization of the CGMT to a family of problems in machine learning (ML) with correlated entries in the data matrix. This family includes various familiar examples of problems with shared weights or repeated features. In particular, we make use of our theorem to obtain asymptotically exact learning curves for regression with vector valued labels, regression with complex variables, and regression with convolution.
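For background, the standard CGMT being generalized (due to Thrampoulidis, Oymak, and Hassibi; not stated on this page) can be sketched as follows. The regularity conditions are abbreviated here, and the notation is illustrative rather than the paper's own:

```latex
% Background sketch of the standard CGMT; compactness and
% convexity-concavity conditions are omitted for brevity.
% G has i.i.d. N(0,1) entries; g, h are independent standard Gaussian vectors.
\begin{align*}
\text{Primary (PO):}\quad
  &\Phi(G) = \min_{w \in S_w} \max_{u \in S_u}
     \; u^\top G w + \psi(w, u), \\
\text{Auxiliary (AO):}\quad
  &\phi(g, h) = \min_{w \in S_w} \max_{u \in S_u}
     \; \|w\|_2 \, g^\top u + \|u\|_2 \, h^\top w + \psi(w, u).
\end{align*}
% When S_w and S_u are convex and compact and \psi is convex in w and
% concave in u, concentration results for \phi(g, h) transfer to \Phi(G),
% so the simpler AO characterizes the statistical behavior of the PO.
```

The paper's contribution, per the abstract, is extending this correspondence from matrices G with i.i.d. Gaussian entries to data matrices with correlated (repeated) entries.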

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-bosch25a,
  title     = {A Novel Convex Gaussian Min Max Theorem for Repeated Features},
  author    = {Bosch, David and Panahi, Ashkan},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3673--3681},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/bosch25a/bosch25a.pdf},
  url       = {https://proceedings.mlr.press/v258/bosch25a.html},
  abstract  = {The Convex Gaussian Min-Max Theorem (CGMT) allows for the study of min-max optimization problems over bilinear Gaussian forms by instead considering an alternative optimization problem whose statistical properties are tied to that of the primary optimization. We prove a generalization of the CGMT to a family of problems in machine learning (ML) with correlated entries in the data matrix. This family includes various familiar examples of problems with shared weights or repeated features. In particular, we make use of our theorem to obtain asymptotically exact learning curves for regression with vector valued labels, regression with complex variables, and regression with convolution.}
}
Endnote
%0 Conference Paper
%T A Novel Convex Gaussian Min Max Theorem for Repeated Features
%A David Bosch
%A Ashkan Panahi
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-bosch25a
%I PMLR
%P 3673--3681
%U https://proceedings.mlr.press/v258/bosch25a.html
%V 258
%X The Convex Gaussian Min-Max Theorem (CGMT) allows for the study of min-max optimization problems over bilinear Gaussian forms by instead considering an alternative optimization problem whose statistical properties are tied to that of the primary optimization. We prove a generalization of the CGMT to a family of problems in machine learning (ML) with correlated entries in the data matrix. This family includes various familiar examples of problems with shared weights or repeated features. In particular, we make use of our theorem to obtain asymptotically exact learning curves for regression with vector valued labels, regression with complex variables, and regression with convolution.
APA
Bosch, D. & Panahi, A. (2025). A Novel Convex Gaussian Min Max Theorem for Repeated Features. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3673-3681. Available from https://proceedings.mlr.press/v258/bosch25a.html.