Tractable Search for Learning Exponential Models of Rankings

Bhushan Mandhani, Marina Meila
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:392-399, 2009.

Abstract

We consider the problem of learning the Generalized Mallows (GM) model of [Fligner and Verducci, 1986], which represents a probability distribution over all possible permutations (aka rankings) of a given set of objects. The training data consists of a set of permutations. This problem generalizes the well known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
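As a rough illustration of the kind of model the paper studies (this sketch is mine, not the authors' method): the Generalized Mallows model uses a vector of stage-wise parameters, but its single-parameter special case, the Mallows model, already shows the structure. It puts probability proportional to exp(-theta * d(pi, sigma)) on each permutation pi, where sigma is a central ranking and d is the Kendall tau (inversion) distance. For small n it can be brute-forced and checked against the known closed-form normalizer:

```python
import itertools
import math

def kendall_tau(pi, sigma):
    """Kendall tau distance: number of pairwise disagreements between two rankings."""
    pos = {item: i for i, item in enumerate(sigma)}  # item -> rank under sigma
    s = [pos[item] for item in pi]
    return sum(1 for i in range(len(s))
                 for j in range(i + 1, len(s)) if s[i] > s[j])

def mallows_pmf(theta, sigma):
    """Brute-force single-parameter Mallows distribution over all permutations."""
    weights = {p: math.exp(-theta * kendall_tau(list(p), sigma))
               for p in itertools.permutations(sigma)}
    Z = sum(weights.values())
    return {p: w / Z for p, w in weights.items()}

theta, sigma = 0.8, [0, 1, 2, 3]
pmf = mallows_pmf(theta, sigma)

# Closed-form normalizer (q-factorial identity): Z = prod_{i=1}^{n} (1 - q^i) / (1 - q),
# with q = exp(-theta). Brute force and closed form should agree.
q = math.exp(-theta)
Z_closed = math.prod((1 - q**i) / (1 - q) for i in range(1, len(sigma) + 1))
```

For theta > 0 the central ranking sigma is the mode; the generalized model of the paper replaces the single theta with one parameter per rank position, which is what makes maximum-likelihood estimation a search problem.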

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-mandhani09a,
  title     = {Tractable Search for Learning Exponential Models of Rankings},
  author    = {Mandhani, Bhushan and Meila, Marina},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {392--399},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/mandhani09a/mandhani09a.pdf},
  url       = {https://proceedings.mlr.press/v5/mandhani09a.html},
  abstract  = {We consider the problem of learning the Generalized Mallows (GM) model of [Fligner and Verducci, 1986], which represents a probability distribution over all possible permutations (aka rankings) of a given set of objects. The training data consists of a set of permutations. This problem generalizes the well known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.}
}
Endnote
%0 Conference Paper
%T Tractable Search for Learning Exponential Models of Rankings
%A Bhushan Mandhani
%A Marina Meila
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-mandhani09a
%I PMLR
%P 392--399
%U https://proceedings.mlr.press/v5/mandhani09a.html
%V 5
%X We consider the problem of learning the Generalized Mallows (GM) model of [Fligner and Verducci, 1986], which represents a probability distribution over all possible permutations (aka rankings) of a given set of objects. The training data consists of a set of permutations. This problem generalizes the well known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
RIS
TY - CPAPER
TI - Tractable Search for Learning Exponential Models of Rankings
AU - Bhushan Mandhani
AU - Marina Meila
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-mandhani09a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 5
SP - 392
EP - 399
L1 - http://proceedings.mlr.press/v5/mandhani09a/mandhani09a.pdf
UR - https://proceedings.mlr.press/v5/mandhani09a.html
AB - We consider the problem of learning the Generalized Mallows (GM) model of [Fligner and Verducci, 1986], which represents a probability distribution over all possible permutations (aka rankings) of a given set of objects. The training data consists of a set of permutations. This problem generalizes the well known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
ER -
APA
Mandhani, B. & Meila, M. (2009). Tractable Search for Learning Exponential Models of Rankings. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:392-399. Available from https://proceedings.mlr.press/v5/mandhani09a.html.