Class-Weighted Classification: Trade-offs and Robust Approaches

Ziyu Xu, Chen Dan, Justin Khim, Pradeep Ravikumar
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10544-10554, 2020.

Abstract

We consider imbalanced classification, the problem in which a label may have low marginal probability relative to other labels, by weighting losses according to the correct class. First, we examine the convergence rates of the expected excess weighted risk of plug-in classifiers, where the weighting for the plug-in classifier and the risk may be different. This leads to irreducible errors that do not converge to the weighted Bayes risk, which motivates our consideration of robust risks. We define a robust risk that minimizes risk over a set of weightings, show excess risk bounds for this problem, and demonstrate that particular choices of the weighting set lead to a special instance of conditional value at risk (CVaR) from stochastic programming, which we call label conditional value at risk (LCVaR). Additionally, we generalize this weighting to derive a new robust risk problem that we call label heterogeneous conditional value at risk (LHCVaR). Finally, we empirically demonstrate the efficacy of LCVaR and LHCVaR in improving class conditional risks.
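As a rough illustration of the quantities discussed in the abstract, the Python sketch below computes a class-weighted empirical risk and an LCVaR-style objective using the standard CVaR variational form, CVaR_alpha(Z) = min_t { t + E[(Z - t)_+] / alpha }, applied to class-conditional risks. This is a minimal sketch under those assumptions; the function names (class_weighted_risk, lcvar_risk) and the grid search over the threshold t are illustrative choices, not the paper's reference implementation.

import numpy as np

def class_weighted_risk(losses, labels, weights):
    # Average loss with each example's loss scaled by the weight of its
    # true class (weights[k] is the weight assigned to class k).
    return np.mean(weights[labels] * losses)

def lcvar_risk(losses, labels, num_classes, alpha):
    # Illustrative LCVaR-style objective (not the paper's reference code):
    # CVaR at level alpha of the class-conditional empirical risks, computed
    # via the variational form CVaR_alpha(Z) = min_t t + E[(Z - t)_+] / alpha.
    class_risks = np.array([
        losses[labels == k].mean() if np.any(labels == k) else 0.0
        for k in range(num_classes)
    ])
    # Empirical class marginals play the role of the label distribution.
    probs = np.array([(labels == k).mean() for k in range(num_classes)])
    # Crude one-dimensional minimization over the threshold t by grid search.
    candidates = np.linspace(class_risks.min(), class_risks.max(), 1000)
    values = [t + np.dot(probs, np.maximum(class_risks - t, 0.0)) / alpha
              for t in candidates]
    return min(values)

# Toy usage: per-example losses, labels, and a weighting that upweights class 1.
losses = np.array([0.2, 1.5, 0.3, 2.0])
labels = np.array([0, 1, 0, 1])
weights = np.array([1.0, 3.0])
print(class_weighted_risk(losses, labels, weights))
print(lcvar_risk(losses, labels, num_classes=2, alpha=0.5))

Smaller alpha places more emphasis on the worst class-conditional risks, which is the sense in which the robust risk protects minority classes.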

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-xu20b,
  title     = {Class-Weighted Classification: Trade-offs and Robust Approaches},
  author    = {Xu, Ziyu and Dan, Chen and Khim, Justin and Ravikumar, Pradeep},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10544--10554},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/xu20b/xu20b.pdf},
  url       = {https://proceedings.mlr.press/v119/xu20b.html},
  abstract  = {We consider imbalanced classification, the problem in which a label may have low marginal probability relative to other labels, by weighting losses according to the correct class. First, we examine the convergence rates of the expected excess weighted risk of plug-in classifiers where the weighting for the plug-in classifier and the risk may be different. This leads to irreducible errors that do not converge to the weighted Bayes risk, which motivates our consideration of robust risks. We define a robust risk that minimizes risk over a set of weightings, show excess risk bounds for this problem, and demonstrate that particular choices of the weighting set leads to a special instance of conditional value at risk (CVaR) from stochastic programming, which we call label conditional value at risk (LCVaR). Additionally, we generalize this weighting to derive a new robust risk problem that we call label heterogeneous conditional value at risk (LHCVaR). Finally, we empirically demonstrate the efficacy of LCVaR and LHCVaR on improving class conditional risks.}
}
Endnote
%0 Conference Paper
%T Class-Weighted Classification: Trade-offs and Robust Approaches
%A Ziyu Xu
%A Chen Dan
%A Justin Khim
%A Pradeep Ravikumar
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-xu20b
%I PMLR
%P 10544--10554
%U https://proceedings.mlr.press/v119/xu20b.html
%V 119
%X We consider imbalanced classification, the problem in which a label may have low marginal probability relative to other labels, by weighting losses according to the correct class. First, we examine the convergence rates of the expected excess weighted risk of plug-in classifiers where the weighting for the plug-in classifier and the risk may be different. This leads to irreducible errors that do not converge to the weighted Bayes risk, which motivates our consideration of robust risks. We define a robust risk that minimizes risk over a set of weightings, show excess risk bounds for this problem, and demonstrate that particular choices of the weighting set leads to a special instance of conditional value at risk (CVaR) from stochastic programming, which we call label conditional value at risk (LCVaR). Additionally, we generalize this weighting to derive a new robust risk problem that we call label heterogeneous conditional value at risk (LHCVaR). Finally, we empirically demonstrate the efficacy of LCVaR and LHCVaR on improving class conditional risks.
APA
Xu, Z., Dan, C., Khim, J. & Ravikumar, P. (2020). Class-Weighted Classification: Trade-offs and Robust Approaches. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10544-10554. Available from https://proceedings.mlr.press/v119/xu20b.html.
