Selective Classification via One-Sided Prediction

Aditya Gangrade, Anil Kag, Venkatesh Saligrama
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:2179-2187, 2021.

Abstract

We propose a novel method for selective classification (SC), a problem which allows a classifier to abstain from predicting some instances, thus trading off accuracy against coverage (the fraction of instances predicted). In contrast to prior gating- or confidence-set-based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding the largest decision sets for each class that have few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high-target-accuracy regime, and further admits efficient implementation, leading to a flexible and principled method for SC. We theoretically derive generalization bounds for SC and OSP, and empirically we show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
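To make the abstract's core idea concrete, the sketch below illustrates selective classification with class-wise one-sided acceptance on toy data. This is not the paper's actual OSP training procedure (which optimises learned one-sided empirical risks); it is a minimal illustration, assuming hand-crafted per-class scores and a single shared threshold `tau`, of how per-class accept sets induce an abstain option and a coverage/accuracy trade-off.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D two-class data: overlapping Gaussian blobs.
n = 1000
X = np.concatenate([rng.normal(-1.0, 1.0, n), rng.normal(+1.0, 1.0, n)])
y = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])

# Hypothetical per-class one-sided scores: confidence that x belongs to class k.
# scores[i, k] is high when x_i lies on class k's side of the origin.
scores = np.stack([-X, X], axis=1)

# Acceptance threshold: raising tau shrinks each class's accept set,
# reducing false positives at the cost of coverage.
tau = 1.0
accept = scores > tau  # class-wise accept sets (may overlap or both be empty)

# Predict only where exactly one class accepts the point; otherwise abstain.
predict_mask = accept.sum(axis=1) == 1
pred = accept.argmax(axis=1)

coverage = predict_mask.mean()
accuracy = (pred[predict_mask] == y[predict_mask]).mean()
print(f"coverage={coverage:.2f}  selective accuracy={accuracy:.2f}")
```

Sweeping `tau` traces out the accuracy-versus-coverage curve; in the paper, the analogous role is played by the constraint level on each class's one-sided (false-positive) risk, and the accept sets are learned rather than fixed.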

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-gangrade21a,
  title     = {Selective Classification via One-Sided Prediction},
  author    = {Gangrade, Aditya and Kag, Anil and Saligrama, Venkatesh},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {2179--2187},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/gangrade21a/gangrade21a.pdf},
  url       = {https://proceedings.mlr.press/v130/gangrade21a.html},
  abstract  = {We propose a novel method for selective classification (SC), a problem which allows a classifier to abstain from predicting some instances, thus trading off accuracy against coverage (the fraction of instances predicted). In contrast to prior gating or confidence-set based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding the largest decision sets for each class that have few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high target accuracy regime, and further admits efficient implementation, leading to a flexible and principled method for SC. We theoretically derive generalization bounds for SC and OSP, and empirically we show that our scheme strongly outperforms state of the art methods in coverage at small error levels.}
}
Endnote
%0 Conference Paper
%T Selective Classification via One-Sided Prediction
%A Aditya Gangrade
%A Anil Kag
%A Venkatesh Saligrama
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-gangrade21a
%I PMLR
%P 2179--2187
%U https://proceedings.mlr.press/v130/gangrade21a.html
%V 130
%X We propose a novel method for selective classification (SC), a problem which allows a classifier to abstain from predicting some instances, thus trading off accuracy against coverage (the fraction of instances predicted). In contrast to prior gating or confidence-set based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding the largest decision sets for each class that have few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high target accuracy regime, and further admits efficient implementation, leading to a flexible and principled method for SC. We theoretically derive generalization bounds for SC and OSP, and empirically we show that our scheme strongly outperforms state of the art methods in coverage at small error levels.
APA
Gangrade, A., Kag, A. & Saligrama, V. (2021). Selective Classification via One-Sided Prediction. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:2179-2187. Available from https://proceedings.mlr.press/v130/gangrade21a.html.