User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates

Daogao Liu, Hilal Asi
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4240-4248, 2024.

Abstract

We study differentially private stochastic convex optimization (DP-SCO) under user-level privacy, where each user may hold multiple data items. Existing work for user-level DP-SCO either requires super-polynomial runtime (Ghazi et al., 2023) or requires the number of users to grow polynomially with the dimensionality of the problem, under additional strict assumptions (Bassily et al., 2023). We develop new algorithms for user-level DP-SCO that obtain optimal rates for both convex and strongly convex functions in polynomial time and require the number of users to grow only logarithmically in the dimension. Moreover, our algorithms are the first to obtain optimal rates for non-smooth functions in polynomial time. These algorithms are based on multiple-pass DP-SGD, combined with a novel private mean estimation procedure for concentrated data, which applies an outlier-removal step before estimating the mean of the gradients.
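To make the high-level recipe concrete, the sketch below illustrates one plausible reading of it in Python: each user contributes the average gradient over its own items, an outlier-removal pass drops gradients far from a robust center, and Gaussian noise is added to the mean of the survivors before a gradient step. This is a minimal illustration under stated assumptions, not the paper's actual algorithm: the function names, the median-based filtering rule, and the parameters `clip_radius` and `noise_scale` are all hypothetical, and the sensitivity analysis that calibrates `noise_scale` to a target (epsilon, delta), which is the technical heart of the paper, is omitted.

```python
import numpy as np

def private_gradient_mean(user_grads, clip_radius, noise_scale, rng):
    """Hypothetical sketch of private mean estimation for concentrated data.

    user_grads: (n_users, d) array; row i is user i's averaged gradient.
    clip_radius: distance threshold for the outlier-removal step (assumed).
    noise_scale: std of the Gaussian noise, assumed pre-calibrated to the
        target (epsilon, delta) via the sensitivity of the filtered mean.
    """
    # Outlier removal: measure each user's distance to a robust center
    # (here the coordinate-wise median) and drop gradients that are too far.
    center = np.median(user_grads, axis=0)
    dists = np.linalg.norm(user_grads - center, axis=1)
    survivors = user_grads[dists <= clip_radius]
    if survivors.shape[0] == 0:   # degenerate case; a real algorithm
        survivors = user_grads    # must handle this privately as well
    # Gaussian mechanism on the mean of the surviving gradients. When the
    # data are concentrated, filtering keeps the sensitivity of this mean
    # small, so little noise is needed.
    mean = survivors.mean(axis=0)
    return mean + rng.normal(0.0, noise_scale, size=mean.shape)

def dp_sgd(loss_grad, data_per_user, w0, steps, lr,
           clip_radius, noise_scale, seed=0):
    """Multiple-pass DP-SGD driven by the estimator above (illustrative)."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        # Each user averages the gradient of the loss over its own items.
        user_grads = np.stack([
            np.mean([loss_grad(w, x) for x in items], axis=0)
            for items in data_per_user
        ])
        w = w - lr * private_gradient_mean(user_grads, clip_radius,
                                           noise_scale, rng)
    return w
```

In a faithful implementation, the privacy cost of every pass would be composed across all iterations (e.g., via Rényi DP accounting), and the outlier-removal test itself would have to be performed privately; both are glossed over in this sketch.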

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-liu24g,
  title     = {User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates},
  author    = {Liu, Daogao and Asi, Hilal},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4240--4248},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/liu24g/liu24g.pdf},
  url       = {https://proceedings.mlr.press/v238/liu24g.html},
  abstract  = {We study differentially private stochastic convex optimization (DP-SCO) under user-level privacy, where each user may hold multiple data items. Existing work for user-level DP-SCO either requires super-polynomial runtime (Ghazi et al., 2023) or requires the number of users to grow polynomially with the dimensionality of the problem, under additional strict assumptions (Bassily et al., 2023). We develop new algorithms for user-level DP-SCO that obtain optimal rates for both convex and strongly convex functions in polynomial time and require the number of users to grow only logarithmically in the dimension. Moreover, our algorithms are the first to obtain optimal rates for non-smooth functions in polynomial time. These algorithms are based on multiple-pass DP-SGD, combined with a novel private mean estimation procedure for concentrated data, which applies an outlier-removal step before estimating the mean of the gradients.}
}
Endnote
%0 Conference Paper
%T User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates
%A Daogao Liu
%A Hilal Asi
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-liu24g
%I PMLR
%P 4240--4248
%U https://proceedings.mlr.press/v238/liu24g.html
%V 238
%X We study differentially private stochastic convex optimization (DP-SCO) under user-level privacy, where each user may hold multiple data items. Existing work for user-level DP-SCO either requires super-polynomial runtime (Ghazi et al., 2023) or requires the number of users to grow polynomially with the dimensionality of the problem, under additional strict assumptions (Bassily et al., 2023). We develop new algorithms for user-level DP-SCO that obtain optimal rates for both convex and strongly convex functions in polynomial time and require the number of users to grow only logarithmically in the dimension. Moreover, our algorithms are the first to obtain optimal rates for non-smooth functions in polynomial time. These algorithms are based on multiple-pass DP-SGD, combined with a novel private mean estimation procedure for concentrated data, which applies an outlier-removal step before estimating the mean of the gradients.
APA
Liu, D. & Asi, H. (2024). User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4240-4248. Available from https://proceedings.mlr.press/v238/liu24g.html.