Implicit rate-constrained optimization of non-decomposable objectives

Abhishek Kumar, Harikrishna Narasimhan, Andrew Cotter
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:5861-5871, 2021.

Abstract

We consider a popular family of constrained optimization problems arising in machine learning that involve optimizing a non-decomposable evaluation metric with a certain thresholded form, while constraining another metric of interest. Examples of such problems include optimizing the false negative rate at a fixed false positive rate, optimizing precision at a fixed recall, optimizing the area under the precision-recall or ROC curves, etc. Our key idea is to formulate a rate-constrained optimization that expresses the threshold parameter as a function of the model parameters via the Implicit Function Theorem. We show how the resulting optimization problem can be solved using standard gradient-based methods. Experiments on benchmark datasets demonstrate the effectiveness of our proposed method over existing state-of-the-art approaches for these problems.
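
The sketch below is not the authors' code; it only illustrates the idea in the abstract for one instantiation: minimizing the false negative rate (FNR) at a fixed false positive rate (FPR), with sigmoid-relaxed rates so that gradients exist, and with the threshold treated as an implicit function of the model parameters. All names (smooth_fpr, solve_threshold, implicit_rate_constrained_loss), the bisection solver, and the PyTorch framing are illustrative assumptions.

import torch

def smooth_fpr(scores_neg, threshold, temp=1.0):
    # Sigmoid-relaxed fraction of negative examples scored above the threshold.
    return torch.sigmoid((scores_neg - threshold) / temp).mean()

def smooth_fnr(scores_pos, threshold, temp=1.0):
    # Sigmoid-relaxed fraction of positive examples scored below the threshold.
    return torch.sigmoid((threshold - scores_pos) / temp).mean()

def solve_threshold(scores_neg, target_fpr, lo=-20.0, hi=20.0, iters=50):
    # Bisection for the threshold at which the relaxed FPR equals the target.
    # The relaxed FPR is decreasing in the threshold, so bisection applies.
    with torch.no_grad():
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if smooth_fpr(scores_neg, mid) > target_fpr:
                lo = mid
            else:
                hi = mid
    return torch.tensor(0.5 * (lo + hi))

def implicit_rate_constrained_loss(scores_pos, scores_neg, target_fpr):
    # The threshold lam(theta) is defined implicitly by FPR(theta, lam) = target_fpr.
    # It is solved on detached scores; its dependence on theta is handled
    # analytically below rather than through autograd.
    lam = solve_threshold(scores_neg.detach(), target_fpr)

    # Partial derivatives of both relaxed rates with respect to the threshold.
    lam_var = lam.clone().requires_grad_(True)
    d_fnr_dlam, = torch.autograd.grad(smooth_fnr(scores_pos.detach(), lam_var), lam_var)
    d_fpr_dlam, = torch.autograd.grad(smooth_fpr(scores_neg.detach(), lam_var), lam_var)
    ratio = (d_fnr_dlam / d_fpr_dlam).detach()

    # Implicit Function Theorem: dlam/dtheta = -(dFPR/dtheta) / (dFPR/dlam), so
    #   d/dtheta FNR(theta, lam(theta)) = dFNR/dtheta - ratio * dFPR/dtheta.
    # The surrogate below (with lam and ratio held constant) has exactly this
    # gradient with respect to the model parameters theta.
    return smooth_fnr(scores_pos, lam) - ratio * smooth_fpr(scores_neg, lam)

In a training loop, scores_pos and scores_neg would be the model's scores on the positive and negative examples of a batch, and implicit_rate_constrained_loss(scores_pos, scores_neg, target_fpr).backward() would drive a standard optimizer step; holding the threshold and the derivative ratio fixed per batch is what lets ordinary autograd reproduce the implicit gradient.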

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-kumar21b,
  title     = {Implicit rate-constrained optimization of non-decomposable objectives},
  author    = {Kumar, Abhishek and Narasimhan, Harikrishna and Cotter, Andrew},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {5861--5871},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/kumar21b/kumar21b.pdf},
  url       = {https://proceedings.mlr.press/v139/kumar21b.html},
  abstract  = {We consider a popular family of constrained optimization problems arising in machine learning that involve optimizing a non-decomposable evaluation metric with a certain thresholded form, while constraining another metric of interest. Examples of such problems include optimizing false negative rate at a fixed false positive rate, optimizing precision at a fixed recall, optimizing the area under the precision-recall or ROC curves, etc. Our key idea is to formulate a rate-constrained optimization that expresses the threshold parameter as a function of the model parameters via the Implicit Function theorem. We show how the resulting optimization problem can be solved using standard gradient based methods. Experiments on benchmark datasets demonstrate the effectiveness of our proposed method over existing state-of-the-art approaches for these problems.}
}
Endnote
%0 Conference Paper
%T Implicit rate-constrained optimization of non-decomposable objectives
%A Abhishek Kumar
%A Harikrishna Narasimhan
%A Andrew Cotter
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-kumar21b
%I PMLR
%P 5861--5871
%U https://proceedings.mlr.press/v139/kumar21b.html
%V 139
%X We consider a popular family of constrained optimization problems arising in machine learning that involve optimizing a non-decomposable evaluation metric with a certain thresholded form, while constraining another metric of interest. Examples of such problems include optimizing false negative rate at a fixed false positive rate, optimizing precision at a fixed recall, optimizing the area under the precision-recall or ROC curves, etc. Our key idea is to formulate a rate-constrained optimization that expresses the threshold parameter as a function of the model parameters via the Implicit Function theorem. We show how the resulting optimization problem can be solved using standard gradient based methods. Experiments on benchmark datasets demonstrate the effectiveness of our proposed method over existing state-of-the-art approaches for these problems.
APA
Kumar, A., Narasimhan, H. & Cotter, A. (2021). Implicit rate-constrained optimization of non-decomposable objectives. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:5861-5871. Available from https://proceedings.mlr.press/v139/kumar21b.html.