Representation Preserving Multiclass Agnostic to Realizable Reduction

Steve Hanneke, Qinglin Meng, Amirreza Shaeiri
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:21995-22008, 2025.

Abstract

We study the problem of multiclass classification when the number of labels can be unbounded within the PAC learning framework. Our main contribution is a theory that demonstrates a simple and elegant agnostic to realizable reduction for this framework. This resolves an open problem raised in the recent work of Hopkins et al. (2022). Notably, our result is the first representation preserving multiclass agnostic to realizable reduction, in contrast with the compression-based approach of David et al. (2017). Furthermore, our main theorem is stated in an abstract framework, called “Unified PAC Learning”, which encompasses a range of frameworks, including multiclass PAC learning, list PAC learning, and multilabel PAC learning. In addition, we explore representation preserving reductions to the realizable setting for two noise models, namely Massart noise and Tsybakov noise, in the multiclass PAC learning framework. We believe our technique may find other applications in subsequent studies of theoretical machine learning.
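
For context (this sketch is standard background, not excerpted from the paper), the two PAC criteria that such a reduction connects can be written as follows, with $\mathrm{err}_{\mathcal{D}}(h) = \Pr_{(x,y)\sim\mathcal{D}}[h(x) \neq y]$ denoting the error of a hypothesis $h$ under a distribution $\mathcal{D}$ over $\mathcal{X} \times \mathcal{Y}$, and $\hat{h}$ the learner's output on an i.i.d. sample of size $m(\varepsilon, \delta)$:

\[
\text{(realizable, i.e. } \exists\, h^{\star} \in \mathcal{H} \text{ with } \mathrm{err}_{\mathcal{D}}(h^{\star}) = 0\text{):}\qquad
\Pr\big[\, \mathrm{err}_{\mathcal{D}}(\hat{h}) \le \varepsilon \,\big] \ge 1 - \delta,
\]
\[
\text{(agnostic, arbitrary } \mathcal{D}\text{):}\qquad
\Pr\Big[\, \mathrm{err}_{\mathcal{D}}(\hat{h}) \le \inf_{h \in \mathcal{H}} \mathrm{err}_{\mathcal{D}}(h) + \varepsilon \,\Big] \ge 1 - \delta.
\]

Informally, an agnostic to realizable reduction turns any learner satisfying the first guarantee into one satisfying the second; the abstract's point is that, unlike the compression-based construction of David et al. (2017), the reduction here does so while preserving the representation of the hypotheses.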

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-hanneke25a,
  title     = {Representation Preserving Multiclass Agnostic to Realizable Reduction},
  author    = {Hanneke, Steve and Meng, Qinglin and Shaeiri, Amirreza},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {21995--22008},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/hanneke25a/hanneke25a.pdf},
  url       = {https://proceedings.mlr.press/v267/hanneke25a.html},
  abstract  = {We study the problem of multiclass classification when the number of labels can be unbounded within the PAC learning framework. Our main contribution is a theory that demonstrates a simple and elegant agnostic to realizable reduction for this framework. This resolves an open problem raised by the recent work of (Hopkins et al., 2022). Notably, our result is the first representation preserving multiclass agnostic to realizable reduction, in contrast with the compression based approach of the work of (David et al., 2017). Furthermore, our main theorem is stated in an abstract framework, called “Unified PAC Learning”, which encompasses a range of frameworks, including multiclass PAC learning, list PAC learning, and multilabel PAC learning. In addition, we explore representation preserving reductions to the realizable setting for two noise models, namely Massart noise and Tsybakov noise, in the multiclass PAC learning framework. We believe our technique may find other applications in ensuing studies of theoretical machine learning.}
}
Endnote
%0 Conference Paper
%T Representation Preserving Multiclass Agnostic to Realizable Reduction
%A Steve Hanneke
%A Qinglin Meng
%A Amirreza Shaeiri
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-hanneke25a
%I PMLR
%P 21995--22008
%U https://proceedings.mlr.press/v267/hanneke25a.html
%V 267
%X We study the problem of multiclass classification when the number of labels can be unbounded within the PAC learning framework. Our main contribution is a theory that demonstrates a simple and elegant agnostic to realizable reduction for this framework. This resolves an open problem raised by the recent work of (Hopkins et al., 2022). Notably, our result is the first representation preserving multiclass agnostic to realizable reduction, in contrast with the compression based approach of the work of (David et al., 2017). Furthermore, our main theorem is stated in an abstract framework, called “Unified PAC Learning”, which encompasses a range of frameworks, including multiclass PAC learning, list PAC learning, and multilabel PAC learning. In addition, we explore representation preserving reductions to the realizable setting for two noise models, namely Massart noise and Tsybakov noise, in the multiclass PAC learning framework. We believe our technique may find other applications in ensuing studies of theoretical machine learning.
APA
Hanneke, S., Meng, Q. & Shaeiri, A. (2025). Representation Preserving Multiclass Agnostic to Realizable Reduction. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:21995-22008. Available from https://proceedings.mlr.press/v267/hanneke25a.html.