Metric-Optimized Example Weights

Sen Zhao, Mahdi Milani Fard, Harikrishna Narasimhan, Maya Gupta
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7533-7542, 2019.

Abstract

Real-world machine learning applications often have complex test metrics, and may have training and test data that are not identically distributed. Motivated by known connections between complex test metrics and cost-weighted learning, we propose addressing these issues by using a weighted loss function with a standard loss, where the weights on the training examples are learned to optimize the test metric on a validation set. These metric-optimized example weights can be learned for any test metric, including black box and customized ones for specific applications. We illustrate the performance of the proposed method on diverse public benchmark datasets and real-world applications. We also provide a generalization bound for the method.
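As an illustration only, and not the algorithm given in the paper, the Python sketch below captures the core idea of the abstract: fit a standard model with per-example training weights, score it on a validation set with an arbitrary (possibly black-box) metric, and keep weight updates that improve that validation score. The random-search weight update, the scikit-learn classifier, and the F1 metric are all stand-in assumptions for illustration.

# Minimal sketch of metric-optimized example weights (illustrative, not the
# authors' method): search over per-example weights so that a model trained
# with a weighted standard loss scores well on a black-box validation metric.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score  # stand-in for any test metric

def train_weighted(X_tr, y_tr, w):
    # Standard loss (logistic), with per-example weights on the training set.
    return LogisticRegression(max_iter=1000).fit(X_tr, y_tr, sample_weight=w)

def metric_optimized_weights(X_tr, y_tr, X_val, y_val,
                             metric=f1_score, n_rounds=50, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    w = np.ones(len(y_tr))  # start from uniform weights
    best = metric(y_val, train_weighted(X_tr, y_tr, w).predict(X_val))
    for _ in range(n_rounds):
        # Propose a small multiplicative perturbation of the current weights.
        w_new = np.clip(w * np.exp(step * rng.standard_normal(len(w))), 1e-3, None)
        score = metric(y_val, train_weighted(X_tr, y_tr, w_new).predict(X_val))
        if score > best:  # keep the weights only if the validation metric improves
            w, best = w_new, score
    return w, best

Because the weights are evaluated only through the validation metric, the metric itself never needs to be differentiable or even known in closed form.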

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-zhao19b,
  title     = {Metric-Optimized Example Weights},
  author    = {Zhao, Sen and Fard, Mahdi Milani and Narasimhan, Harikrishna and Gupta, Maya},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7533--7542},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/zhao19b/zhao19b.pdf},
  url       = {https://proceedings.mlr.press/v97/zhao19b.html},
  abstract  = {Real-world machine learning applications often have complex test metrics, and may have training and test data that are not identically distributed. Motivated by known connections between complex test metrics and cost-weighted learning, we propose addressing these issues by using a weighted loss function with a standard loss, where the weights on the training examples are learned to optimize the test metric on a validation set. These metric-optimized example weights can be learned for any test metric, including black box and customized ones for specific applications. We illustrate the performance of the proposed method on diverse public benchmark datasets and real-world applications. We also provide a generalization bound for the method.}
}
Endnote
%0 Conference Paper
%T Metric-Optimized Example Weights
%A Sen Zhao
%A Mahdi Milani Fard
%A Harikrishna Narasimhan
%A Maya Gupta
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-zhao19b
%I PMLR
%P 7533--7542
%U https://proceedings.mlr.press/v97/zhao19b.html
%V 97
%X Real-world machine learning applications often have complex test metrics, and may have training and test data that are not identically distributed. Motivated by known connections between complex test metrics and cost-weighted learning, we propose addressing these issues by using a weighted loss function with a standard loss, where the weights on the training examples are learned to optimize the test metric on a validation set. These metric-optimized example weights can be learned for any test metric, including black box and customized ones for specific applications. We illustrate the performance of the proposed method on diverse public benchmark datasets and real-world applications. We also provide a generalization bound for the method.
APA
Zhao, S., Fard, M. M., Narasimhan, H., & Gupta, M. (2019). Metric-Optimized Example Weights. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7533-7542. Available from https://proceedings.mlr.press/v97/zhao19b.html.