Variational Inference from Ranked Samples with Features

Yuan Guo, Jennifer Dy, Deniz Erdoğmuş, Jayashree Kalpathy-Cramer, Susan Ostmo, J. Peter Campbell, Michael F. Chiang, Stratis Ioannidis
Proceedings of The Eleventh Asian Conference on Machine Learning, PMLR 101:599-614, 2019.

Abstract

In many supervised learning settings, elicited labels comprise pairwise comparisons or rankings of samples. We propose a Bayesian inference model for ranking datasets, allowing us to take a probabilistic approach to ranking inference. Our probabilistic assumptions are motivated by, and consistent with, the so-called Plackett-Luce model. We propose a variational inference method to extract a closed-form Gaussian posterior distribution. We show experimentally that the resulting posterior yields more reliable ranking predictions compared to predictions via point estimates.
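For readers unfamiliar with the Plackett-Luce model the abstract refers to: each sample i is assigned a latent utility s_i, and a ranking is generated by repeatedly selecting the next item with probability proportional to exp(s_i) among the items not yet placed. The sketch below computes the resulting log-likelihood for feature-based samples; the linear parametrization s_i = beta^T x_i and the names X, ranking, and beta are illustrative assumptions for this sketch, not necessarily the paper's exact model.

import numpy as np

def plackett_luce_log_likelihood(X, ranking, beta):
    """Log-likelihood of one observed ranking under a Plackett-Luce model
    with linear utilities s_i = beta^T x_i (illustrative parametrization)."""
    scores = X @ beta              # latent utilities, one per sample
    s = scores[ranking]            # utilities in ranked (best-first) order
    ll = 0.0
    for k in range(len(s)):
        tail = s[k:]               # utilities of items not yet placed at stage k
        # log P(item at position k chosen next) = s_k - logsumexp(remaining utilities)
        m = np.max(tail)
        ll += s[k] - (m + np.log(np.sum(np.exp(tail - m))))
    return ll

# Toy usage: three samples with two features each, ranked best-first as [2, 0, 1].
X = np.array([[1.0, 0.5],
              [0.2, -1.0],
              [0.9, 1.5]])
print(plackett_luce_log_likelihood(X, [2, 0, 1], np.array([0.7, -0.3])))

A point estimate of the parameters could be obtained by maximizing this log-likelihood over the observed rankings; the paper instead takes a Bayesian approach, using variational inference to obtain a closed-form Gaussian posterior, which the abstract reports yields more reliable ranking predictions than such point estimates.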

Cite this Paper


BibTeX
@InProceedings{pmlr-v101-guo19a,
  title     = {Variational Inference from Ranked Samples with Features},
  author    = {Guo, Yuan and Dy, Jennifer and Erdo\u{g}mu\c{s}, Deniz and Kalpathy-Cramer, Jayashree and Ostmo, Susan and Campbell, J. Peter and Chiang, Michael F. and Ioannidis, Stratis},
  booktitle = {Proceedings of The Eleventh Asian Conference on Machine Learning},
  pages     = {599--614},
  year      = {2019},
  editor    = {Lee, Wee Sun and Suzuki, Taiji},
  volume    = {101},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v101/guo19a/guo19a.pdf},
  url       = {https://proceedings.mlr.press/v101/guo19a.html},
  abstract  = {In many supervised learning settings, elicited labels comprise pairwise comparisons or rankings of samples. We propose a Bayesian inference model for ranking datasets, allowing us to take a probabilistic approach to ranking inference. Our probabilistic assumptions are motivated by, and consistent with, the so-called Plackett-Luce model. We propose a variational inference method to extract a closed-form Gaussian posterior distribution. We show experimentally that the resulting posterior yields more reliable ranking predictions compared to predictions via point estimates.}
}
Endnote
%0 Conference Paper
%T Variational Inference from Ranked Samples with Features
%A Yuan Guo
%A Jennifer Dy
%A Deniz Erdoğmuş
%A Jayashree Kalpathy-Cramer
%A Susan Ostmo
%A J. Peter Campbell
%A Michael F. Chiang
%A Stratis Ioannidis
%B Proceedings of The Eleventh Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Wee Sun Lee
%E Taiji Suzuki
%F pmlr-v101-guo19a
%I PMLR
%P 599--614
%U https://proceedings.mlr.press/v101/guo19a.html
%V 101
%X In many supervised learning settings, elicited labels comprise pairwise comparisons or rankings of samples. We propose a Bayesian inference model for ranking datasets, allowing us to take a probabilistic approach to ranking inference. Our probabilistic assumptions are motivated by, and consistent with, the so-called Plackett-Luce model. We propose a variational inference method to extract a closed-form Gaussian posterior distribution. We show experimentally that the resulting posterior yields more reliable ranking predictions compared to predictions via point estimates.
APA
Guo, Y., Dy, J., Erdoğmuş, D., Kalpathy-Cramer, J., Ostmo, S., Campbell, J.P., Chiang, M.F. & Ioannidis, S. (2019). Variational Inference from Ranked Samples with Features. Proceedings of The Eleventh Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 101:599-614. Available from https://proceedings.mlr.press/v101/guo19a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v101/guo19a/guo19a.pdf