Tractable Search for Learning Exponential Models of Rankings

Bhushan Mandhani, Marina Meila
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:392-399, 2009.

Abstract

We consider the problem of learning the Generalized Mallows (GM) model of Fligner and Verducci (1986), which represents a probability distribution over all possible permutations (i.e., rankings) of a given set of objects. The training data consist of a set of permutations. This problem generalizes the well-known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
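The GM model named in the abstract is an exponential family over permutations: P(π) ∝ exp(−Σⱼ θⱼ Vⱼ(π)), where the Vⱼ are stage-wise inversion counts of π relative to a central ranking σ (they decompose the Kendall tau distance), and the normalizing constant factors in closed form. This is not code from the paper; it is a minimal sketch of evaluating the GM likelihood, with illustrative function names:

```python
import math

def inversion_code(pi, sigma):
    """V_j(pi | sigma): for each position j of the central ranking sigma,
    count the items that sigma places after sigma[j] but that pi ranks
    before it. The sum of the V_j is the Kendall tau distance."""
    pos = {item: r for r, item in enumerate(pi)}
    return [sum(1 for b in sigma[j + 1:] if pos[b] < pos[sigma[j]])
            for j in range(len(sigma) - 1)]

def gm_log_prob(pi, sigma, thetas):
    """log P(pi) under a Generalized Mallows model with central ranking
    sigma and dispersion parameters theta_1 .. theta_{n-1}."""
    n = len(sigma)
    v = inversion_code(pi, sigma)
    # Closed-form normalizer: Z = prod_j sum_{r=0}^{n-j} exp(-theta_j * r),
    # since V_j ranges over {0, ..., n-j} independently across stages.
    log_z = sum(math.log(sum(math.exp(-t * r) for r in range(n - j + 1)))
                for j, t in enumerate(thetas, start=1))
    return -sum(t * vj for t, vj in zip(thetas, v)) - log_z
```

With equal θⱼ this reduces to the ordinary Mallows model; the probabilities over all n! permutations sum to one because the inversion counts are independent under the model.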

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-mandhani09a,
  title = {Tractable Search for Learning Exponential Models of Rankings},
  author = {Bhushan Mandhani and Marina Meila},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages = {392--399},
  year = {2009},
  editor = {David van Dyk and Max Welling},
  volume = {5},
  series = {Proceedings of Machine Learning Research},
  address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month = {16--18 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v5/mandhani09a/mandhani09a.pdf},
  url = {http://proceedings.mlr.press/v5/mandhani09a.html},
  abstract = {We consider the problem of learning the Generalized Mallows (GM) model of \cite{fv86}, which represents a probability distribution over all possible permutations (i.e., rankings) of a given set of objects. The training data consist of a set of permutations. This problem generalizes the well-known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.}
}
Endnote
%0 Conference Paper
%T Tractable Search for Learning Exponential Models of Rankings
%A Bhushan Mandhani
%A Marina Meila
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-mandhani09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 392--399
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X We consider the problem of learning the Generalized Mallows (GM) model of Fligner and Verducci (1986), which represents a probability distribution over all possible permutations (i.e., rankings) of a given set of objects. The training data consist of a set of permutations. This problem generalizes the well-known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
RIS
TY - CPAPER
TI - Tractable Search for Learning Exponential Models of Rankings
AU - Bhushan Mandhani
AU - Marina Meila
BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY - 2009/04/15
DA - 2009/04/15
ED - David van Dyk
ED - Max Welling
ID - pmlr-v5-mandhani09a
PB - PMLR
SP - 392
DP - PMLR
EP - 399
L1 - http://proceedings.mlr.press/v5/mandhani09a/mandhani09a.pdf
UR - http://proceedings.mlr.press/v5/mandhani09a.html
AB - We consider the problem of learning the Generalized Mallows (GM) model of Fligner and Verducci (1986), which represents a probability distribution over all possible permutations (i.e., rankings) of a given set of objects. The training data consist of a set of permutations. This problem generalizes the well-known rank aggregation problem. Maximum Likelihood estimation of the GM model is NP-hard. An exact but inefficient search-based method was recently proposed for this problem. Here we introduce the first non-trivial heuristic function for this search. We justify it theoretically, and show why it is admissible in practice. We experimentally demonstrate its effectiveness, and show that it is superior to existing techniques for learning the GM model. We also show good performance of a family of faster approximate methods of search.
ER -
APA
Mandhani, B. & Meila, M. (2009). Tractable Search for Learning Exponential Models of Rankings. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:392-399