StochasticRank: Global Optimization of Scale-Free Discrete Functions

Aleksei Ustimenko, Liudmila Prokhorenkova
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9669-9679, 2020.

Abstract

In this paper, we introduce a powerful and efficient framework for direct optimization of ranking metrics. The problem is ill-posed due to the discrete structure of the loss, and to deal with that, we introduce two important techniques: stochastic smoothing and a novel gradient estimate based on partial integration. We show that classic smoothing approaches may introduce bias and present a universal solution for proper debiasing. Importantly, we can guarantee global convergence of our method by adopting the recently proposed Stochastic Gradient Langevin Boosting algorithm. Our algorithm is implemented as part of the CatBoost gradient boosting library and outperforms the existing approaches on several learning-to-rank datasets. In addition to ranking metrics, our framework applies to any scale-free discrete loss function.
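The abstract's core idea, stochastic smoothing of a discrete ranking loss, can be illustrated with a minimal sketch. This is not the paper's algorithm (StochasticRank uses a debiased, partial-integration gradient estimate inside gradient boosting); it only shows the underlying trick: a piecewise-constant loss L(scores) becomes smooth in the scores once averaged over Gaussian perturbations. The function names and the Monte Carlo estimator here are illustrative assumptions.

```python
import numpy as np

def dcg_loss(scores, relevance):
    """Negative DCG of the ranking induced by `scores`.

    Discrete and scale-free: it depends only on the ordering of the
    scores, so its gradient is zero almost everywhere.
    """
    order = np.argsort(-scores)                     # ranking induced by scores
    gains = 2.0 ** relevance[order] - 1.0
    discounts = 1.0 / np.log2(np.arange(2, len(scores) + 2))
    return -np.sum(gains * discounts)

def smoothed_loss(scores, relevance, sigma=0.1, n_samples=2000, seed=0):
    """Monte Carlo estimate of E_eps[ L(scores + sigma * eps) ].

    The expectation over Gaussian noise is a smooth function of the
    scores, even though each sample of the loss is piecewise constant.
    """
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal((n_samples, len(scores)))
    return float(np.mean([dcg_loss(scores + sigma * e, relevance)
                          for e in eps]))

scores = np.array([0.5, 0.2, 0.9])
relevance = np.array([1.0, 0.0, 2.0])
print(dcg_loss(scores, relevance))       # exact discrete loss
print(smoothed_loss(scores, relevance))  # smooth surrogate, close for small sigma
```

Note that naively replacing the loss with its smoothed version can shift the minimizer — the bias the paper's debiasing procedure addresses.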

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-ustimenko20a,
  title     = {{S}tochastic{R}ank: Global Optimization of Scale-Free Discrete Functions},
  author    = {Ustimenko, Aleksei and Prokhorenkova, Liudmila},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9669--9679},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/ustimenko20a/ustimenko20a.pdf},
  url       = {http://proceedings.mlr.press/v119/ustimenko20a.html},
  abstract  = {In this paper, we introduce a powerful and efficient framework for direct optimization of ranking metrics. The problem is ill-posed due to the discrete structure of the loss, and to deal with that, we introduce two important techniques: stochastic smoothing and novel gradient estimate based on partial integration. We show that classic smoothing approaches may introduce bias and present a universal solution for a proper debiasing. Importantly, we can guarantee global convergence of our method by adopting a recently proposed Stochastic Gradient Langevin Boosting algorithm. Our algorithm is implemented as a part of the CatBoost gradient boosting library and outperforms the existing approaches on several learning-to-rank datasets. In addition to ranking metrics, our framework applies to any scale-free discrete loss function.}
}
Endnote
%0 Conference Paper
%T StochasticRank: Global Optimization of Scale-Free Discrete Functions
%A Aleksei Ustimenko
%A Liudmila Prokhorenkova
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-ustimenko20a
%I PMLR
%P 9669--9679
%U http://proceedings.mlr.press/v119/ustimenko20a.html
%V 119
%X In this paper, we introduce a powerful and efficient framework for direct optimization of ranking metrics. The problem is ill-posed due to the discrete structure of the loss, and to deal with that, we introduce two important techniques: stochastic smoothing and novel gradient estimate based on partial integration. We show that classic smoothing approaches may introduce bias and present a universal solution for a proper debiasing. Importantly, we can guarantee global convergence of our method by adopting a recently proposed Stochastic Gradient Langevin Boosting algorithm. Our algorithm is implemented as a part of the CatBoost gradient boosting library and outperforms the existing approaches on several learning-to-rank datasets. In addition to ranking metrics, our framework applies to any scale-free discrete loss function.
APA
Ustimenko, A. & Prokhorenkova, L. (2020). StochasticRank: Global Optimization of Scale-Free Discrete Functions. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9669-9679. Available from http://proceedings.mlr.press/v119/ustimenko20a.html.