Learning Mixtures of Plackett-Luce Models

Zhibing Zhao, Peter Piech, Lirong Xia
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2906-2914, 2016.

Abstract

In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k≥2, the mixture of k Plackett-Luce models for no more than 2k-1 alternatives is non-identifiable and this bound is tight for k=2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is generically identifiable if k≤⌊(m-2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.
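For readers unfamiliar with the model: a Plackett-Luce model assigns each of the m alternatives a positive weight, and generates a ranking by repeatedly selecting the next item with probability proportional to its weight among the items not yet ranked; a k-mixture draws each ranking from one of k such models according to mixing weights. The sketch below (plain Python/NumPy; the function names and data layout are illustrative assumptions, not taken from the paper) computes these probabilities.

```python
import numpy as np

def pl_prob(ranking, gamma):
    """Probability of a full ranking under one Plackett-Luce model.

    ranking: alternatives listed from most to least preferred (indices 0..m-1).
    gamma:   positive weight per alternative; overall normalization is
             irrelevant because each factor is a ratio of weights.
    """
    gamma = np.asarray(gamma, dtype=float)
    remaining = list(ranking)
    prob = 1.0
    for alt in ranking:
        prob *= gamma[alt] / gamma[remaining].sum()  # pick alt among those left
        remaining.remove(alt)
    return prob

def mixture_prob(ranking, alpha, gammas):
    """Probability under a k-component mixture with mixing weights alpha."""
    return sum(a * pl_prob(ranking, g) for a, g in zip(alpha, gammas))

# Example: a 2-component mixture over m = 4 alternatives.
gammas = [np.array([0.4, 0.3, 0.2, 0.1]),
          np.array([0.1, 0.2, 0.3, 0.4])]
print(mixture_prob([2, 0, 3, 1], alpha=[0.6, 0.4], gammas=gammas))
```

The paper's identifiability results concern whether the mixing weights and component weights are uniquely recoverable from the distribution over rankings that `mixture_prob` defines.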

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-zhaob16,
  title     = {Learning Mixtures of Plackett-Luce Models},
  author    = {Zhao, Zhibing and Piech, Peter and Xia, Lirong},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {2906--2914},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/zhaob16.pdf},
  url       = {https://proceedings.mlr.press/v48/zhaob16.html},
  abstract  = {In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k≥2, the mixture of k Plackett-Luce models for no more than 2k-1 alternatives is non-identifiable and this bound is tight for k=2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is generically identifiable if k≤⌊(m-2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.}
}
EndNote
%0 Conference Paper
%T Learning Mixtures of Plackett-Luce Models
%A Zhibing Zhao
%A Peter Piech
%A Lirong Xia
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-zhaob16
%I PMLR
%P 2906--2914
%U https://proceedings.mlr.press/v48/zhaob16.html
%V 48
%X In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k≥2, the mixture of k Plackett-Luce models for no more than 2k-1 alternatives is non-identifiable and this bound is tight for k=2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is generically identifiable if k≤⌊(m-2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.
RIS
TY - CPAPER
TI - Learning Mixtures of Plackett-Luce Models
AU - Zhibing Zhao
AU - Peter Piech
AU - Lirong Xia
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-zhaob16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2906
EP - 2914
L1 - http://proceedings.mlr.press/v48/zhaob16.pdf
UR - https://proceedings.mlr.press/v48/zhaob16.html
AB - In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k≥2, the mixture of k Plackett-Luce models for no more than 2k-1 alternatives is non-identifiable and this bound is tight for k=2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is generically identifiable if k≤⌊(m-2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.
ER -
APA
Zhao, Z., Piech, P. & Xia, L. (2016). Learning Mixtures of Plackett-Luce Models. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2906-2914. Available from https://proceedings.mlr.press/v48/zhaob16.html.