Group-realizable multi-group learning by minimizing empirical risk

Navid Ardeshir, Samuel Deng, Daniel Hsu, Jingwen Liu
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-12, 2026.

Abstract

The sample complexity of multi-group learning is shown to improve in the group-realizable setting over the agnostic setting, even when the family of groups is infinite so long as it has finite VC dimension. The improved sample complexity is obtained by empirical risk minimization over the class of group-realizable concepts, which itself could have infinite VC dimension. Implementing this approach is also shown to be computationally intractable, and an alternative approach is suggested based on improper learning.
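The abstract's key notion can be illustrated with a small toy sketch (this is an illustration of the definitions, not the paper's algorithm): a finite class of 1-D threshold hypotheses and two overlapping groups. A distribution is group-realizable when every group admits a hypothesis with zero error on that group; the snippet checks this empirically and runs plain ERM over the thresholds. All data, groups, and hypotheses here are invented for illustration.

```python
# Toy illustration of group-realizability and ERM (hypothetical example,
# not the paper's construction or its sample-complexity analysis).

X = [0.5, 1.5, 2.5, 3.5]                    # 1-D points
y = [0, 0, 1, 1]                            # binary labels
groups = {"A": [0, 1, 2], "B": [1, 2, 3]}   # overlapping index sets

thresholds = [0.0, 1.0, 2.0, 3.0, 4.0]      # hypothesis class: x -> 1[x >= t]

def predict(t, x):
    return int(x >= t)

def empirical_error(t, idx):
    """Fraction of mistakes hypothesis t makes on the indexed subsample."""
    return sum(predict(t, X[i]) != y[i] for i in idx) / len(idx)

# Group-realizability: every group has some zero-error hypothesis.
realizable = all(
    any(empirical_error(t, idx) == 0 for t in thresholds)
    for idx in groups.values()
)

# Ordinary ERM over the (finite) hypothesis class on the full sample.
best = min(thresholds, key=lambda t: empirical_error(t, range(len(X))))

print(realizable, best)  # here t = 2.0 is consistent with every group
```

In this toy case a single threshold is consistent with all groups at once; the paper's point is about the harder regime where the groups come from an infinite family of finite VC dimension and the class of group-realizable concepts may itself have infinite VC dimension.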

Cite this Paper


BibTeX
@InProceedings{pmlr-v313-ardeshir26a,
  title     = {Group-realizable multi-group learning by minimizing empirical risk},
  author    = {Ardeshir, Navid and Deng, Samuel and Hsu, Daniel and Liu, Jingwen},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--12},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/ardeshir26a/ardeshir26a.pdf},
  url       = {https://proceedings.mlr.press/v313/ardeshir26a.html},
  abstract  = {The sample complexity of multi-group learning is shown to improve in the group-realizable setting over the agnostic setting, even when the family of groups is infinite so long as it has finite VC dimension. The improved sample complexity is obtained by empirical risk minimization over the class of group-realizable concepts, which itself could have infinite VC dimension. Implementing this approach is also shown to be computationally intractable, and an alternative approach is suggested based on improper learning.}
}
Endnote
%0 Conference Paper
%T Group-realizable multi-group learning by minimizing empirical risk
%A Navid Ardeshir
%A Samuel Deng
%A Daniel Hsu
%A Jingwen Liu
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-ardeshir26a
%I PMLR
%P 1--12
%U https://proceedings.mlr.press/v313/ardeshir26a.html
%V 313
%X The sample complexity of multi-group learning is shown to improve in the group-realizable setting over the agnostic setting, even when the family of groups is infinite so long as it has finite VC dimension. The improved sample complexity is obtained by empirical risk minimization over the class of group-realizable concepts, which itself could have infinite VC dimension. Implementing this approach is also shown to be computationally intractable, and an alternative approach is suggested based on improper learning.
APA
Ardeshir, N., Deng, S., Hsu, D. & Liu, J. (2026). Group-realizable multi-group learning by minimizing empirical risk. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-12. Available from https://proceedings.mlr.press/v313/ardeshir26a.html.