Recommendations with Sparse Comparison Data: Provably Fast Convergence for Nonconvex Matrix Factorization

Suryanarayana Sankagiri, Jalal Etesami, Matthias Grossglauser
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:52798-52832, 2025.

Abstract

In this paper, we consider a recommender system that elicits user feedback through pairwise comparisons instead of ratings. We study the problem of learning personalised preferences from such comparison data via collaborative filtering. Similar to the classical matrix completion setting, we assume that users and items are endowed with low-dimensional latent features. These features give rise to user-item utilities, and the comparison outcomes are governed by a discrete choice model over these utilities. The task of learning these features is then formulated as a maximum likelihood problem over the comparison dataset. Despite the resulting optimization problem being nonconvex, we show that gradient-based methods converge exponentially to the latent features, given a warm start. Importantly, this result holds in a sparse data regime, where each user compares only a few pairs of items. Our main technical contribution is to extend key concentration results commonly used in matrix completion to our model. Simulations reveal that the empirical performance of the method exceeds theoretical predictions, even when some assumptions are relaxed. Our work demonstrates that learning personalised recommendations from comparison data is both computationally and statistically efficient.
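The pipeline the abstract describes can be sketched in a few lines: users and items get low-rank latent features, comparison outcomes follow a discrete choice model over the utility differences, and the features are fit by gradient descent on the negative log-likelihood from a warm start. The sketch below is illustrative only: it assumes a Bradley-Terry-Luce (logistic) choice model, and the dimensions, learning rate, number of comparisons per user, and warm-start perturbation are made-up values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n users, m items, rank-r latent features.
n, m, r = 30, 40, 2

# Ground-truth features; the utility of item j for user i is U[i] @ V[j].
U_true = rng.normal(size=(n, r))
V_true = rng.normal(size=(m, r))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Sparse comparison data: each user compares only a few random item pairs.
# Outcomes follow a logistic (Bradley-Terry-Luce style) model over utility
# differences -- one common discrete choice model, assumed here for concreteness.
comparisons = []  # tuples (user i, item j, item k, outcome y); y=1 means j preferred
pairs_per_user = 20
for i in range(n):
    for _ in range(pairs_per_user):
        j, k = rng.choice(m, size=2, replace=False)
        p = sigmoid(U_true[i] @ (V_true[j] - V_true[k]))
        comparisons.append((i, j, k, float(rng.random() < p)))

def neg_log_lik(U, V):
    """Negative log-likelihood of the observed comparisons."""
    nll = 0.0
    for i, j, k, y in comparisons:
        p = sigmoid(U[i] @ (V[j] - V[k]))
        nll -= y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12)
    return nll

# Warm start: a small perturbation of the truth (the theory assumes one).
U = U_true + 0.1 * rng.normal(size=U_true.shape)
V = V_true + 0.1 * rng.normal(size=V_true.shape)

# Plain gradient descent on the nonconvex maximum-likelihood objective.
lr = 0.01
losses = [neg_log_lik(U, V)]
for _ in range(50):
    gU, gV = np.zeros_like(U), np.zeros_like(V)
    for i, j, k, y in comparisons:
        d = V[j] - V[k]
        err = sigmoid(U[i] @ d) - y  # derivative of the logistic loss w.r.t. U[i] @ d
        gU[i] += err * d
        gV[j] += err * U[i]
        gV[k] -= err * U[i]
    U -= lr * gU
    V -= lr * gV
    losses.append(neg_log_lik(U, V))

print(f"negative log-likelihood: {losses[0]:.1f} -> {losses[-1]:.1f}")
```

Despite the nonconvexity, the loss drops steadily from the warm start, which is the qualitative behavior the paper proves (exponential convergence); note the factorization is only identifiable up to an invertible transform, so one compares utilities rather than raw features.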

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-sankagiri25a,
  title     = {Recommendations with Sparse Comparison Data: Provably Fast Convergence for Nonconvex Matrix Factorization},
  author    = {Sankagiri, Suryanarayana and Etesami, Jalal and Grossglauser, Matthias},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {52798--52832},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/sankagiri25a/sankagiri25a.pdf},
  url       = {https://proceedings.mlr.press/v267/sankagiri25a.html},
  abstract  = {In this paper, we consider a recommender system that elicits user feedback through pairwise comparisons instead of ratings. We study the problem of learning personalised preferences from such comparison data via collaborative filtering. Similar to the classical matrix completion setting, we assume that users and items are endowed with low-dimensional latent features. These features give rise to user-item utilities, and the comparison outcomes are governed by a discrete choice model over these utilities. The task of learning these features is then formulated as a maximum likelihood problem over the comparison dataset. Despite the resulting optimization problem being nonconvex, we show that gradient-based methods converge exponentially to the latent features, given a warm start. Importantly, this result holds in a sparse data regime, where each user compares only a few pairs of items. Our main technical contribution is to extend key concentration results commonly used in matrix completion to our model. Simulations reveal that the empirical performance of the method exceeds theoretical predictions, even when some assumptions are relaxed. Our work demonstrates that learning personalised recommendations from comparison data is both computationally and statistically efficient.}
}
Endnote
%0 Conference Paper %T Recommendations with Sparse Comparison Data: Provably Fast Convergence for Nonconvex Matrix Factorization %A Suryanarayana Sankagiri %A Jalal Etesami %A Matthias Grossglauser %B Proceedings of the 42nd International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2025 %E Aarti Singh %E Maryam Fazel %E Daniel Hsu %E Simon Lacoste-Julien %E Felix Berkenkamp %E Tegan Maharaj %E Kiri Wagstaff %E Jerry Zhu %F pmlr-v267-sankagiri25a %I PMLR %P 52798--52832 %U https://proceedings.mlr.press/v267/sankagiri25a.html %V 267 %X In this paper, we consider a recommender system that elicits user feedback through pairwise comparisons instead of ratings. We study the problem of learning personalised preferences from such comparison data via collaborative filtering. Similar to the classical matrix completion setting, we assume that users and items are endowed with low-dimensional latent features. These features give rise to user-item utilities, and the comparison outcomes are governed by a discrete choice model over these utilities. The task of learning these features is then formulated as a maximum likelihood problem over the comparison dataset. Despite the resulting optimization problem being nonconvex, we show that gradient-based methods converge exponentially to the latent features, given a warm start. Importantly, this result holds in a sparse data regime, where each user compares only a few pairs of items. Our main technical contribution is to extend key concentration results commonly used in matrix completion to our model. Simulations reveal that the empirical performance of the method exceeds theoretical predictions, even when some assumptions are relaxed. Our work demonstrates that learning personalised recommendations from comparison data is both computationally and statistically efficient.
APA
Sankagiri, S., Etesami, J. & Grossglauser, M. (2025). Recommendations with Sparse Comparison Data: Provably Fast Convergence for Nonconvex Matrix Factorization. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:52798-52832. Available from https://proceedings.mlr.press/v267/sankagiri25a.html.