Web-Search Ranking with Initialized Gradient Boosted Regression Trees

Ananth Mohan, Zheng Chen, Kilian Weinberger
Proceedings of the Learning to Rank Challenge, PMLR 14:77-89, 2011.

Abstract

In May 2010 Yahoo! Inc. hosted the Learning to Rank Challenge. This paper summarizes the approach of the highly placed team from Washington University in St. Louis. We investigate Random Forests (RF) as a low-cost alternative to Gradient Boosted Regression Trees (GBRT), the de facto standard of web-search ranking. We demonstrate that RF yields surprisingly accurate ranking results, comparable to or better than GBRT. We combine the two algorithms by first learning a ranking function with RF and using it as the initialization for GBRT; we refer to this setting as iGBRT. Following a recent discussion by Li et al. (2007), we show that the results of iGBRT can be improved upon even further when the web-search ranking task is cast as classification instead of regression. We provide an upper bound on the Expected Reciprocal Rank (Chapelle et al., 2009) in terms of classification error and demonstrate that iGBRT outperforms GBRT and RF on the Microsoft Learning to Rank and Yahoo Ranking Competition data sets with surprising consistency.
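
As a concrete illustration of the iGBRT idea (not the authors' implementation), gradient boosting can be started from Random Forest predictions rather than from a constant. The sketch below uses scikit-learn, whose GradientBoostingRegressor accepts an initial estimator via its `init` parameter; the data and all hyperparameter values are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch of iGBRT: boost from Random Forest predictions
# instead of a constant. Data and hyperparameters are illustrative
# assumptions, not the configuration used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.rand(1000, 20)                         # query-document feature vectors
y = rng.randint(0, 5, 1000).astype(float)      # graded relevance labels 0..4

# Step 1: a Random Forest ranking function (pointwise regression).
rf = RandomForestRegressor(n_estimators=300, max_features="sqrt",
                           random_state=0)

# Step 2: GBRT initialized with the RF; `init` accepts any estimator
# with fit/predict, so boosting proceeds from the RF's predictions.
igbrt = GradientBoostingRegressor(init=rf, n_estimators=500,
                                  learning_rate=0.05, max_depth=4,
                                  random_state=0)
igbrt.fit(X, y)
scores = igbrt.predict(X)   # rank documents within each query by score
```

For reference, the Expected Reciprocal Rank of Chapelle et al. (2009), the metric whose upper bound the paper derives, is defined for a ranked list with relevance grades g_1, ..., g_n as:

```latex
\mathrm{ERR} = \sum_{r=1}^{n} \frac{1}{r}\, R_r \prod_{i=1}^{r-1} (1 - R_i),
\qquad R_i = \frac{2^{g_i} - 1}{2^{g_{\max}}},
```

where $g_{\max}$ is the largest relevance grade (4 for the five-level grades used in the challenge), so $R_i$ is the probability that a user is satisfied by the document at position $i$.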

Cite this Paper


BibTeX
@InProceedings{pmlr-v14-mohan11a,
  title     = {Web-Search Ranking with Initialized Gradient Boosted Regression Trees},
  author    = {Mohan, Ananth and Chen, Zheng and Weinberger, Kilian},
  booktitle = {Proceedings of the Learning to Rank Challenge},
  pages     = {77--89},
  year      = {2011},
  editor    = {Chapelle, Olivier and Chang, Yi and Liu, Tie-Yan},
  volume    = {14},
  series    = {Proceedings of Machine Learning Research},
  address   = {Haifa, Israel},
  month     = {25 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v14/mohan11a/mohan11a.pdf},
  url       = {https://proceedings.mlr.press/v14/mohan11a.html},
  abstract  = {In May 2010 Yahoo! Inc. hosted the Learning to Rank Challenge. This paper summarizes the approach of the highly placed team from Washington University in St. Louis. We investigate Random Forests (RF) as a low-cost alternative to Gradient Boosted Regression Trees (GBRT), the de facto standard of web-search ranking. We demonstrate that RF yields surprisingly accurate ranking results, comparable to or better than GBRT. We combine the two algorithms by first learning a ranking function with RF and using it as the initialization for GBRT; we refer to this setting as iGBRT. Following a recent discussion by Li et al. (2007), we show that the results of iGBRT can be improved upon even further when the web-search ranking task is cast as classification instead of regression. We provide an upper bound on the Expected Reciprocal Rank (Chapelle et al., 2009) in terms of classification error and demonstrate that iGBRT outperforms GBRT and RF on the Microsoft Learning to Rank and Yahoo Ranking Competition data sets with surprising consistency.}
}
Endnote
%0 Conference Paper
%T Web-Search Ranking with Initialized Gradient Boosted Regression Trees
%A Ananth Mohan
%A Zheng Chen
%A Kilian Weinberger
%B Proceedings of the Learning to Rank Challenge
%C Proceedings of Machine Learning Research
%D 2011
%E Olivier Chapelle
%E Yi Chang
%E Tie-Yan Liu
%F pmlr-v14-mohan11a
%I PMLR
%P 77--89
%U https://proceedings.mlr.press/v14/mohan11a.html
%V 14
%X In May 2010 Yahoo! Inc. hosted the Learning to Rank Challenge. This paper summarizes the approach of the highly placed team from Washington University in St. Louis. We investigate Random Forests (RF) as a low-cost alternative to Gradient Boosted Regression Trees (GBRT), the de facto standard of web-search ranking. We demonstrate that RF yields surprisingly accurate ranking results, comparable to or better than GBRT. We combine the two algorithms by first learning a ranking function with RF and using it as the initialization for GBRT; we refer to this setting as iGBRT. Following a recent discussion by Li et al. (2007), we show that the results of iGBRT can be improved upon even further when the web-search ranking task is cast as classification instead of regression. We provide an upper bound on the Expected Reciprocal Rank (Chapelle et al., 2009) in terms of classification error and demonstrate that iGBRT outperforms GBRT and RF on the Microsoft Learning to Rank and Yahoo Ranking Competition data sets with surprising consistency.
RIS
TY - CPAPER
TI - Web-Search Ranking with Initialized Gradient Boosted Regression Trees
AU - Ananth Mohan
AU - Zheng Chen
AU - Kilian Weinberger
BT - Proceedings of the Learning to Rank Challenge
DA - 2011/01/26
ED - Olivier Chapelle
ED - Yi Chang
ED - Tie-Yan Liu
ID - pmlr-v14-mohan11a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 14
SP - 77
EP - 89
L1 - http://proceedings.mlr.press/v14/mohan11a/mohan11a.pdf
UR - https://proceedings.mlr.press/v14/mohan11a.html
AB - In May 2010 Yahoo! Inc. hosted the Learning to Rank Challenge. This paper summarizes the approach of the highly placed team from Washington University in St. Louis. We investigate Random Forests (RF) as a low-cost alternative to Gradient Boosted Regression Trees (GBRT), the de facto standard of web-search ranking. We demonstrate that RF yields surprisingly accurate ranking results, comparable to or better than GBRT. We combine the two algorithms by first learning a ranking function with RF and using it as the initialization for GBRT; we refer to this setting as iGBRT. Following a recent discussion by Li et al. (2007), we show that the results of iGBRT can be improved upon even further when the web-search ranking task is cast as classification instead of regression. We provide an upper bound on the Expected Reciprocal Rank (Chapelle et al., 2009) in terms of classification error and demonstrate that iGBRT outperforms GBRT and RF on the Microsoft Learning to Rank and Yahoo Ranking Competition data sets with surprising consistency.
ER -
APA
Mohan, A., Chen, Z. & Weinberger, K. (2011). Web-Search Ranking with Initialized Gradient Boosted Regression Trees. Proceedings of the Learning to Rank Challenge, in Proceedings of Machine Learning Research 14:77-89. Available from https://proceedings.mlr.press/v14/mohan11a.html.