From Noisy Fixed-Point Iterations to Private ADMM for Centralized and Federated Learning

Edwige Cyffers, Aurélien Bellet, Debabrota Basu
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:6683-6711, 2023.

Abstract

We study differentially private (DP) machine learning algorithms as instances of noisy fixed-point iterations, in order to derive privacy and utility results from this well-studied framework. We show that this new perspective recovers popular private gradient-based methods like DP-SGD and provides a principled way to design and analyze new private optimization algorithms in a flexible manner. Focusing on the widely used Alternating Direction Method of Multipliers (ADMM), we use our general framework to derive novel private ADMM algorithms for centralized, federated and fully decentralized learning. We establish strong privacy guarantees for these algorithms, leveraging privacy amplification by iteration and by subsampling. Finally, we provide utility guarantees for the three algorithms using a unified analysis that exploits a recent linear convergence result for noisy fixed-point iterations.
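
To make the noisy fixed-point view concrete, below is a minimal Python sketch of an iteration of the form x_{k+1} = (1 - lam) * x_k + lam * (T(x_k) + noise). Choosing T(x) = x - gamma * grad f(x) with per-step gradient clipping and Gaussian noise yields a DP-SGD-style update, illustrating the abstract's claim that the framework recovers private gradient-based methods. This is a sketch under our own assumptions (function names, step sizes, and noise calibration are illustrative), not the authors' exact algorithm or privacy accounting.

import numpy as np

def clip(g, c):
    # Standard DP-style clipping: rescale g to have L2 norm at most c.
    norm = np.linalg.norm(g)
    return g * min(1.0, c / norm) if norm > 0 else g

def noisy_fixed_point(x0, grad, steps=100, lam=1.0, gamma=0.1,
                      clip_c=1.0, sigma=1.0, rng=None):
    # Illustrative noisy fixed-point iteration (our own sketch):
    #   x <- (1 - lam) * x + lam * (T(x) + noise),
    # with T(x) = x - gamma * clip(grad(x), clip_c) and Gaussian noise
    # scaled to the clipped-gradient sensitivity. With lam = 1 this is
    # a DP-SGD-style update.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        t_x = x - gamma * clip(grad(x), clip_c)
        noise = rng.normal(0.0, sigma * gamma * clip_c, size=x.shape)
        x = (1 - lam) * x + lam * (t_x + noise)
    return x

# Example: private minimization of f(x) = ||x - 1||^2 / 2, whose gradient is x - 1.
x_star = noisy_fixed_point(np.zeros(3), grad=lambda x: x - 1.0,
                           steps=500, gamma=0.05, sigma=0.5)

With lam < 1 the same loop covers averaged (Krasnosel'skii-Mann) fixed-point iterations, which is the generality that lets the framework describe operator-splitting schemes such as ADMM in addition to gradient methods.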

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-cyffers23a,
  title = {From Noisy Fixed-Point Iterations to Private {ADMM} for Centralized and Federated Learning},
  author = {Cyffers, Edwige and Bellet, Aur\'{e}lien and Basu, Debabrota},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages = {6683--6711},
  year = {2023},
  editor = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume = {202},
  series = {Proceedings of Machine Learning Research},
  month = {23--29 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v202/cyffers23a/cyffers23a.pdf},
  url = {https://proceedings.mlr.press/v202/cyffers23a.html},
  abstract = {We study differentially private (DP) machine learning algorithms as instances of noisy fixed-point iterations, in order to derive privacy and utility results from this well-studied framework. We show that this new perspective recovers popular private gradient-based methods like DP-SGD and provides a principled way to design and analyze new private optimization algorithms in a flexible manner. Focusing on the widely used Alternating Direction Method of Multipliers (ADMM), we use our general framework to derive novel private ADMM algorithms for centralized, federated and fully decentralized learning. We establish strong privacy guarantees for these algorithms, leveraging privacy amplification by iteration and by subsampling. Finally, we provide utility guarantees for the three algorithms using a unified analysis that exploits a recent linear convergence result for noisy fixed-point iterations.}
}
Endnote
%0 Conference Paper
%T From Noisy Fixed-Point Iterations to Private ADMM for Centralized and Federated Learning
%A Edwige Cyffers
%A Aurélien Bellet
%A Debabrota Basu
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-cyffers23a
%I PMLR
%P 6683--6711
%U https://proceedings.mlr.press/v202/cyffers23a.html
%V 202
%X We study differentially private (DP) machine learning algorithms as instances of noisy fixed-point iterations, in order to derive privacy and utility results from this well-studied framework. We show that this new perspective recovers popular private gradient-based methods like DP-SGD and provides a principled way to design and analyze new private optimization algorithms in a flexible manner. Focusing on the widely used Alternating Direction Method of Multipliers (ADMM), we use our general framework to derive novel private ADMM algorithms for centralized, federated and fully decentralized learning. We establish strong privacy guarantees for these algorithms, leveraging privacy amplification by iteration and by subsampling. Finally, we provide utility guarantees for the three algorithms using a unified analysis that exploits a recent linear convergence result for noisy fixed-point iterations.
APA
Cyffers, E., Bellet, A., & Basu, D. (2023). From Noisy Fixed-Point Iterations to Private ADMM for Centralized and Federated Learning. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:6683-6711. Available from https://proceedings.mlr.press/v202/cyffers23a.html.
