Private Alternating Least Squares: Practical Private Matrix Completion with Tighter Rates

Steve Chien, Prateek Jain, Walid Krichene, Steffen Rendle, Shuang Song, Abhradeep Thakurta, Li Zhang
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1877-1887, 2021.

Abstract

We study the problem of differentially private (DP) matrix completion under user-level privacy. We design a joint differentially private variant of the popular Alternating-Least-Squares (ALS) method that achieves: i) (nearly) optimal sample complexity for matrix completion (in terms of number of items, users), and ii) the best known privacy/utility trade-off, both theoretically and on benchmark data sets. In particular, we provide the first global convergence analysis of ALS with noise introduced to ensure DP, and show that, in comparison to the best known alternative (the Private Frank-Wolfe algorithm by Jain et al. (2018)), our error bounds scale significantly better with respect to the number of items and users, which is critical in practical problems. Extensive validation on standard benchmarks demonstrates that the algorithm, in combination with carefully designed sampling procedures, is significantly more accurate than existing techniques, thus promising to be the first practical DP embedding model.
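To make the idea concrete, the following is a minimal sketch of noisy alternating least squares for matrix completion: item factors are updated from Gaussian-perturbed sufficient statistics (the side that must be protected), while per-user factors are solved exactly, mirroring the joint-DP setting described above. This is an illustrative toy, not the paper's calibrated DP mechanism — the noise scale here is an arbitrary assumption, with no clipping or privacy accounting.

```python
import numpy as np

def noisy_als(ratings, n_users, n_items, rank=4, n_iters=10,
              noise_std=0.05, reg=0.1, seed=0):
    """Toy ALS for matrix completion with Gaussian noise added to the
    item-side least-squares statistics. Illustrative only: noise_std is
    NOT calibrated to any (eps, delta) privacy guarantee."""
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n_users, rank)) / np.sqrt(rank)
    V = rng.standard_normal((n_items, rank)) / np.sqrt(rank)
    # Index the observed (user, item, rating) triples both ways.
    by_user = [[] for _ in range(n_users)]
    by_item = [[] for _ in range(n_items)]
    for u, i, r in ratings:
        by_user[u].append((i, r))
        by_item[i].append((u, r))
    eye = reg * np.eye(rank)
    for _ in range(n_iters):
        # Item update: perturb the Gram matrix and right-hand side
        # (the statistics that aggregate across users).
        for i, obs in enumerate(by_item):
            if not obs:
                continue
            us = np.array([u for u, _ in obs])
            rs = np.array([r for _, r in obs])
            noise = rng.standard_normal((rank, rank)) * noise_std
            A = U[us].T @ U[us] + eye + (noise + noise.T) / 2
            b = U[us].T @ rs + rng.standard_normal(rank) * noise_std
            V[i] = np.linalg.solve(A, b)
        # User update: exact, since per-user factors stay with the user
        # in the joint-DP setting.
        for u, obs in enumerate(by_user):
            if not obs:
                continue
            its = np.array([i for i, _ in obs])
            rs = np.array([r for _, r in obs])
            A = V[its].T @ V[its] + eye
            U[u] = np.linalg.solve(A, V[its].T @ rs)
    return U, V
```

The alternating structure is what enables the paper's analysis: each half-step is a regularized least-squares solve, so the DP noise enters through well-understood linear systems rather than through a generic gradient oracle.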

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-chien21a,
  title     = {Private Alternating Least Squares: Practical Private Matrix Completion with Tighter Rates},
  author    = {Chien, Steve and Jain, Prateek and Krichene, Walid and Rendle, Steffen and Song, Shuang and Thakurta, Abhradeep and Zhang, Li},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1877--1887},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/chien21a/chien21a.pdf},
  url       = {http://proceedings.mlr.press/v139/chien21a.html},
  abstract  = {We study the problem of differentially private (DP) matrix completion under user-level privacy. We design a joint differentially private variant of the popular Alternating-Least-Squares (ALS) method that achieves: i) (nearly) optimal sample complexity for matrix completion (in terms of number of items, users), and ii) the best known privacy/utility trade-off both theoretically, as well as on benchmark data sets. In particular, we provide the first global convergence analysis of ALS with noise introduced to ensure DP, and show that, in comparison to the best known alternative (the Private Frank-Wolfe algorithm by Jain et al. (2018)), our error bounds scale significantly better with respect to the number of items and users, which is critical in practical problems. Extensive validation on standard benchmarks demonstrate that the algorithm, in combination with carefully designed sampling procedures, is significantly more accurate than existing techniques, thus promising to be the first practical DP embedding model.}
}
Endnote
%0 Conference Paper
%T Private Alternating Least Squares: Practical Private Matrix Completion with Tighter Rates
%A Steve Chien
%A Prateek Jain
%A Walid Krichene
%A Steffen Rendle
%A Shuang Song
%A Abhradeep Thakurta
%A Li Zhang
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-chien21a
%I PMLR
%P 1877--1887
%U http://proceedings.mlr.press/v139/chien21a.html
%V 139
%X We study the problem of differentially private (DP) matrix completion under user-level privacy. We design a joint differentially private variant of the popular Alternating-Least-Squares (ALS) method that achieves: i) (nearly) optimal sample complexity for matrix completion (in terms of number of items, users), and ii) the best known privacy/utility trade-off both theoretically, as well as on benchmark data sets. In particular, we provide the first global convergence analysis of ALS with noise introduced to ensure DP, and show that, in comparison to the best known alternative (the Private Frank-Wolfe algorithm by Jain et al. (2018)), our error bounds scale significantly better with respect to the number of items and users, which is critical in practical problems. Extensive validation on standard benchmarks demonstrate that the algorithm, in combination with carefully designed sampling procedures, is significantly more accurate than existing techniques, thus promising to be the first practical DP embedding model.
APA
Chien, S., Jain, P., Krichene, W., Rendle, S., Song, S., Thakurta, A. & Zhang, L. (2021). Private Alternating Least Squares: Practical Private Matrix Completion with Tighter Rates. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1877-1887. Available from http://proceedings.mlr.press/v139/chien21a.html.