Private and Federated Stochastic Convex Optimization: Efficient Strategies for Centralized Systems

Roie Reshef, Kfir Yehuda Levy
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:42521-42542, 2024.

Abstract

This paper addresses the challenge of preserving privacy in Federated Learning (FL) within centralized systems, focusing on both trusted and untrusted server scenarios. We analyze this setting within the Stochastic Convex Optimization (SCO) framework, and devise methods that ensure Differential Privacy (DP) while maintaining optimal convergence rates for homogeneous and heterogeneous data distributions. Our approach, based on a recent stochastic optimization technique, offers linear computational complexity, comparable to non-private FL methods, and reduced gradient obfuscation. This work enhances the practicality of DP in FL, balancing privacy, efficiency, and robustness in a variety of server trust environments.

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-reshef24a,
  title     = {Private and Federated Stochastic Convex Optimization: Efficient Strategies for Centralized Systems},
  author    = {Reshef, Roie and Levy, Kfir Yehuda},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {42521--42542},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/reshef24a/reshef24a.pdf},
  url       = {https://proceedings.mlr.press/v235/reshef24a.html},
  abstract  = {This paper addresses the challenge of preserving privacy in Federated Learning (FL) within centralized systems, focusing on both trusted and untrusted server scenarios. We analyze this setting within the Stochastic Convex Optimization (SCO) framework, and devise methods that ensure Differential Privacy (DP) while maintaining optimal convergence rates for homogeneous and heterogeneous data distributions. Our approach, based on a recent stochastic optimization technique, offers linear computational complexity, comparable to non-private FL methods, and reduced gradient obfuscation. This work enhances the practicality of DP in FL, balancing privacy, efficiency, and robustness in a variety of server trust environments.}
}
Endnote
%0 Conference Paper
%T Private and Federated Stochastic Convex Optimization: Efficient Strategies for Centralized Systems
%A Roie Reshef
%A Kfir Yehuda Levy
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-reshef24a
%I PMLR
%P 42521--42542
%U https://proceedings.mlr.press/v235/reshef24a.html
%V 235
%X This paper addresses the challenge of preserving privacy in Federated Learning (FL) within centralized systems, focusing on both trusted and untrusted server scenarios. We analyze this setting within the Stochastic Convex Optimization (SCO) framework, and devise methods that ensure Differential Privacy (DP) while maintaining optimal convergence rates for homogeneous and heterogeneous data distributions. Our approach, based on a recent stochastic optimization technique, offers linear computational complexity, comparable to non-private FL methods, and reduced gradient obfuscation. This work enhances the practicality of DP in FL, balancing privacy, efficiency, and robustness in a variety of server trust environments.
APA
Reshef, R. & Levy, K.Y. (2024). Private and Federated Stochastic Convex Optimization: Efficient Strategies for Centralized Systems. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:42521-42542. Available from https://proceedings.mlr.press/v235/reshef24a.html.