Active Learning for Cost-Sensitive Classification

Akshay Krishnamurthy, Alekh Agarwal, Tzu-Kuo Huang, Hal Daumé III, John Langford
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1915-1924, 2017.

Abstract

We design an active learning algorithm for cost-sensitive multiclass classification: problems where different errors have different costs. Our algorithm, COAL, makes predictions by regressing to each label’s cost and predicting the smallest. On a new example, it uses a set of regressors that perform well on past data to estimate possible costs for each label. It queries only the labels that could be the best, ignoring the sure losers. We prove COAL can be efficiently implemented for any regression family that admits squared loss optimization; it also enjoys strong guarantees with respect to predictive performance and labeling effort. Our experiments with COAL show significant improvements in labeling effort and test cost over passive and active baselines.
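
The query rule sketched in the abstract admits a short illustration. The following is a minimal Python sketch under stated assumptions, not the paper's implementation: regressors_per_label and the predict interface are hypothetical stand-ins for the set of regressors that perform well on past data.

def candidate_labels(x, regressors_per_label):
    """Return the labels worth querying on example x.

    regressors_per_label: dict mapping each label to a list of
    regressors (each exposing predict(x) -> estimated cost) that
    achieve low squared loss on past data.
    """
    # Range of plausible costs for each label under the set of
    # well-performing regressors.
    lo, hi = {}, {}
    for label, regs in regressors_per_label.items():
        preds = [r.predict(x) for r in regs]
        lo[label], hi[label] = min(preds), max(preds)

    # A label is a "sure loser" if even its most optimistic cost
    # estimate exceeds the most pessimistic estimate of some other
    # label; such labels can never attain the smallest cost.
    best_pessimistic = min(hi.values())
    return [label for label in lo if lo[label] <= best_pessimistic]

# Example usage (hypothetical regressor lists regs0, regs1, regs2):
# to_query = candidate_labels(x_new, {0: regs0, 1: regs1, 2: regs2})

Only the costs of the returned labels would then be queried, which is the source of the labeling-effort savings the abstract describes.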

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-krishnamurthy17a,
  title =     {Active Learning for Cost-Sensitive Classification},
  author =    {Krishnamurthy, Akshay and Agarwal, Alekh and Huang, Tzu-Kuo and Daum{\'e}, III, Hal and Langford, John},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages =     {1915--1924},
  year =      {2017},
  editor =    {Precup, Doina and Teh, Yee Whye},
  volume =    {70},
  series =    {Proceedings of Machine Learning Research},
  month =     {06--11 Aug},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v70/krishnamurthy17a/krishnamurthy17a.pdf},
  url =       {https://proceedings.mlr.press/v70/krishnamurthy17a.html},
  abstract =  {We design an active learning algorithm for cost-sensitive multiclass classification: problems where different errors have different costs. Our algorithm, COAL, makes predictions by regressing to each label’s cost and predicting the smallest. On a new example, it uses a set of regressors that perform well on past data to estimate possible costs for each label. It queries only the labels that could be the best, ignoring the sure losers. We prove COAL can be efficiently implemented for any regression family that admits squared loss optimization; it also enjoys strong guarantees with respect to predictive performance and labeling effort. Our experiments with COAL show significant improvements in labeling effort and test cost over passive and active baselines.}
}
Endnote
%0 Conference Paper
%T Active Learning for Cost-Sensitive Classification
%A Akshay Krishnamurthy
%A Alekh Agarwal
%A Tzu-Kuo Huang
%A Hal Daumé, III
%A John Langford
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-krishnamurthy17a
%I PMLR
%P 1915--1924
%U https://proceedings.mlr.press/v70/krishnamurthy17a.html
%V 70
%X We design an active learning algorithm for cost-sensitive multiclass classification: problems where different errors have different costs. Our algorithm, COAL, makes predictions by regressing to each label’s cost and predicting the smallest. On a new example, it uses a set of regressors that perform well on past data to estimate possible costs for each label. It queries only the labels that could be the best, ignoring the sure losers. We prove COAL can be efficiently implemented for any regression family that admits squared loss optimization; it also enjoys strong guarantees with respect to predictive performance and labeling effort. Our experiments with COAL show significant improvements in labeling effort and test cost over passive and active baselines.
APA
Krishnamurthy, A., Agarwal, A., Huang, T., Daumé, III, H. & Langford, J. (2017). Active Learning for Cost-Sensitive Classification. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1915-1924. Available from https://proceedings.mlr.press/v70/krishnamurthy17a.html.