Scalable Learning of Non-Decomposable Objectives

Elad Eban, Mariano Schain, Alan Mackey, Ariel Gordon, Ryan Rifkin, Gal Elidan
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:832-840, 2017.

Abstract

Modern retrieval systems are often driven by an underlying machine learning model. The goal of such systems is to identify and possibly rank the few most relevant items for a given query or context. Thus, such systems are typically evaluated using a ranking-based performance metric such as the area under the precision-recall curve, the F score, precision at fixed recall, etc. Obviously, it is desirable to train such systems to optimize the metric of interest. In practice, due to the scalability limitations of existing approaches for optimizing such objectives, large-scale retrieval systems are instead trained to maximize classification accuracy, in the hope that performance as measured via the true objective will also be favorable. In this work we present a unified framework that, using straightforward building block bounds, allows for highly scalable optimization of a wide range of ranking-based objectives. We demonstrate the advantage of our approach on several real-life retrieval problems that are significantly larger than those considered in the literature, while achieving substantial improvement in performance over the accuracy-objective baseline.
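To make the idea of "building block bounds" concrete, here is a minimal sketch (not code from the paper; the function names, the fixed decision threshold of zero, and the precision_at_recall_surrogate helper are illustrative assumptions) of how hinge-style bounds on true positives and false positives can be combined into a surrogate for a constrained metric such as precision at a target recall:

import numpy as np

def hinge(z):
    # Convex upper bound on the 0-1 indicator 1[z <= 0].
    return np.maximum(0.0, 1.0 - z)

def tp_lower_bound(scores, labels):
    # Lower bound on the number of true positives:
    # each positive contributes 1 minus its hinge loss.
    pos = labels == 1
    return np.sum(pos) - np.sum(hinge(scores[pos]))

def fp_upper_bound(scores, labels):
    # Upper bound on the number of false positives:
    # hinge penalty for negatives scored at or above the threshold (0 here).
    neg = labels == 0
    return np.sum(hinge(-scores[neg]))

def precision_at_recall_surrogate(scores, labels, target_recall, lam):
    # Lagrangian surrogate for "maximize precision subject to
    # recall >= target_recall", with multiplier lam >= 0:
    # minimize (bounded) false positives plus lam times the recall shortfall.
    num_pos = np.sum(labels == 1)
    recall_gap = target_recall * num_pos - tp_lower_bound(scores, labels)
    return fp_upper_bound(scores, labels) + lam * recall_gap

Because every term in these bounds is a sum over individual examples, such a surrogate can be evaluated and differentiated on minibatches (with lam updated by gradient ascent), which is the kind of property that makes optimizing ranking-based objectives scalable.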

Cite this Paper


BibTeX
@InProceedings{pmlr-v54-eban17a,
  title     = {{Scalable Learning of Non-Decomposable Objectives}},
  author    = {Eban, Elad and Schain, Mariano and Mackey, Alan and Gordon, Ariel and Rifkin, Ryan and Elidan, Gal},
  booktitle = {Proceedings of the 20th International Conference on Artificial Intelligence and Statistics},
  pages     = {832--840},
  year      = {2017},
  editor    = {Singh, Aarti and Zhu, Jerry},
  volume    = {54},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v54/eban17a/eban17a.pdf},
  url       = {https://proceedings.mlr.press/v54/eban17a.html},
  abstract  = {Modern retrieval systems are often driven by an underlying machine learning model. The goal of such systems is to identify and possibly rank the few most relevant items for a given query or context. Thus, such systems are typically evaluated using a ranking-based performance metric such as the area under the precision-recall curve, the F score, precision at fixed recall, etc. Obviously, it is desirable to train such systems to optimize the metric of interest. In practice, due to the scalability limitations of existing approaches for optimizing such objectives, large-scale retrieval systems are instead trained to maximize classification accuracy, in the hope that performance as measured via the true objective will also be favorable. In this work we present a unified framework that, using straightforward building block bounds, allows for highly scalable optimization of a wide range of ranking-based objectives. We demonstrate the advantage of our approach on several real-life retrieval problems that are significantly larger than those considered in the literature, while achieving substantial improvement in performance over the accuracy-objective baseline.}
}
Endnote
%0 Conference Paper
%T Scalable Learning of Non-Decomposable Objectives
%A Elad Eban
%A Mariano Schain
%A Alan Mackey
%A Ariel Gordon
%A Ryan Rifkin
%A Gal Elidan
%B Proceedings of the 20th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2017
%E Aarti Singh
%E Jerry Zhu
%F pmlr-v54-eban17a
%I PMLR
%P 832--840
%U https://proceedings.mlr.press/v54/eban17a.html
%V 54
%X Modern retrieval systems are often driven by an underlying machine learning model. The goal of such systems is to identify and possibly rank the few most relevant items for a given query or context. Thus, such systems are typically evaluated using a ranking-based performance metric such as the area under the precision-recall curve, the F score, precision at fixed recall, etc. Obviously, it is desirable to train such systems to optimize the metric of interest. In practice, due to the scalability limitations of existing approaches for optimizing such objectives, large-scale retrieval systems are instead trained to maximize classification accuracy, in the hope that performance as measured via the true objective will also be favorable. In this work we present a unified framework that, using straightforward building block bounds, allows for highly scalable optimization of a wide range of ranking-based objectives. We demonstrate the advantage of our approach on several real-life retrieval problems that are significantly larger than those considered in the literature, while achieving substantial improvement in performance over the accuracy-objective baseline.
APA
Eban, E., Schain, M., Mackey, A., Gordon, A., Rifkin, R. & Elidan, G. (2017). Scalable Learning of Non-Decomposable Objectives. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 54:832-840. Available from https://proceedings.mlr.press/v54/eban17a.html.