A Characterization of List Regression

Chirag Pabbaraju, Sahasrajit Sarmasarkar
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:870-920, 2025.

Abstract

There has been a recent interest in understanding and characterizing the sample complexity of list learning tasks, where the learning algorithm is allowed to make a short list of k predictions, and we simply require one of the predictions to be correct. This includes recent works characterizing the PAC sample complexity of standard list classification and online list classification. Adding to this theme, in this work, we provide a complete characterization of list PAC regression. We propose two combinatorial dimensions, namely the k-OIG dimension and the k-fat-shattering dimension, and show that they characterize realizable and agnostic k-list regression respectively. These quantities generalize known dimensions for standard regression. Our work thus extends existing list learning characterizations from classification to regression.

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-pabbaraju25a,
  title     = {A Characterization of List Regression},
  author    = {Pabbaraju, Chirag and Sarmasarkar, Sahasrajit},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {870--920},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/pabbaraju25a/pabbaraju25a.pdf},
  url       = {https://proceedings.mlr.press/v272/pabbaraju25a.html},
  abstract  = {There has been a recent interest in understanding and characterizing the sample complexity of list learning tasks, where the learning algorithm is allowed to make a short list of $k$ predictions, and we simply require one of the predictions to be correct. This includes recent works characterizing the PAC sample complexity of standard list classification and online list classification. Adding to this theme, in this work, we provide a complete characterization of list PAC {\em regression}. We propose two combinatorial dimensions, namely the $k$-OIG dimension and the $k$-fat-shattering dimension, and show that they characterize realizable and agnostic $k$-list regression respectively. These quantities generalize known dimensions for standard regression. Our work thus extends existing list learning characterizations from classification to regression.}
}
Endnote
%0 Conference Paper
%T A Characterization of List Regression
%A Chirag Pabbaraju
%A Sahasrajit Sarmasarkar
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-pabbaraju25a
%I PMLR
%P 870--920
%U https://proceedings.mlr.press/v272/pabbaraju25a.html
%V 272
%X There has been a recent interest in understanding and characterizing the sample complexity of list learning tasks, where the learning algorithm is allowed to make a short list of $k$ predictions, and we simply require one of the predictions to be correct. This includes recent works characterizing the PAC sample complexity of standard list classification and online list classification. Adding to this theme, in this work, we provide a complete characterization of list PAC {\em regression}. We propose two combinatorial dimensions, namely the $k$-OIG dimension and the $k$-fat-shattering dimension, and show that they characterize realizable and agnostic $k$-list regression respectively. These quantities generalize known dimensions for standard regression. Our work thus extends existing list learning characterizations from classification to regression.
APA
Pabbaraju, C. & Sarmasarkar, S. (2025). A Characterization of List Regression. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:870-920. Available from https://proceedings.mlr.press/v272/pabbaraju25a.html.
