Differentially Private Decentralized Learning with Random Walks

Edwige Cyffers, Aurélien Bellet, Jalaj Upadhyay
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:9762-9783, 2024.

Abstract

The popularity of federated learning comes from the promise of better scalability and the ability for participants to keep control of their data, improving data security and sovereignty. Unfortunately, sharing model updates also creates a new privacy attack surface. In this work, we characterize the privacy guarantees of decentralized learning with random walk algorithms, where a model is updated by traveling from one node to another along the edges of a communication graph. Using a recent variant of differential privacy tailored to the study of decentralized algorithms, namely Pairwise Network Differential Privacy, we derive closed-form expressions for the privacy loss between each pair of nodes, where the impact of the communication topology is captured by graph-theoretic quantities. Our results further reveal that random walk algorithms tend to yield better privacy guarantees than gossip algorithms for nodes close to each other. We supplement our theoretical results with empirical evaluation on synthetic and real-world graphs and datasets.
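To make the setting concrete, here is a minimal sketch (not the paper's exact algorithm) of a differentially private random-walk update: a single model hops between neighboring nodes of a communication graph, and each visited node takes a local gradient step perturbed with Gaussian noise. The loss, step size, and noise scale are illustrative assumptions.

```python
import numpy as np

def private_random_walk_sgd(graph, data, steps, lr=0.1, noise_scale=1.0, dim=2, seed=0):
    """Illustrative sketch of DP random-walk learning.

    graph: dict mapping node -> list of neighboring nodes.
    data:  dict mapping node -> (x, y), the local example held by that node.
    """
    rng = np.random.default_rng(seed)
    model = np.zeros(dim)
    node = rng.choice(list(graph))              # start the walk at a random node
    for _ in range(steps):
        x, y = data[node]
        grad = (model @ x - y) * x              # gradient of a toy squared loss
        grad += noise_scale * rng.normal(size=dim)  # Gaussian mechanism for DP
        model -= lr * grad                      # local noisy gradient step
        node = rng.choice(graph[node])          # hop to a uniformly random neighbor
    return model
```

Only the noisy model is ever communicated between nodes, which is why the pairwise privacy loss depends on the graph topology: nodes far from each other only observe the model after many intermediate noisy updates.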

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-cyffers24a,
  title     = {Differentially Private Decentralized Learning with Random Walks},
  author    = {Cyffers, Edwige and Bellet, Aur\'{e}lien and Upadhyay, Jalaj},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {9762--9783},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/cyffers24a/cyffers24a.pdf},
  url       = {https://proceedings.mlr.press/v235/cyffers24a.html},
  abstract  = {The popularity of federated learning comes from the promise of better scalability and the ability for participants to keep control of their data, improving data security and sovereignty. Unfortunately, sharing model updates also creates a new privacy attack surface. In this work, we characterize the privacy guarantees of decentralized learning with random walk algorithms, where a model is updated by traveling from one node to another along the edges of a communication graph. Using a recent variant of differential privacy tailored to the study of decentralized algorithms, namely Pairwise Network Differential Privacy, we derive closed-form expressions for the privacy loss between each pair of nodes, where the impact of the communication topology is captured by graph-theoretic quantities. Our results further reveal that random walk algorithms tend to yield better privacy guarantees than gossip algorithms for nodes close to each other. We supplement our theoretical results with empirical evaluation on synthetic and real-world graphs and datasets.}
}
Endnote
%0 Conference Paper %T Differentially Private Decentralized Learning with Random Walks %A Edwige Cyffers %A Aurélien Bellet %A Jalaj Upadhyay %B Proceedings of the 41st International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2024 %E Ruslan Salakhutdinov %E Zico Kolter %E Katherine Heller %E Adrian Weller %E Nuria Oliver %E Jonathan Scarlett %E Felix Berkenkamp %F pmlr-v235-cyffers24a %I PMLR %P 9762--9783 %U https://proceedings.mlr.press/v235/cyffers24a.html %V 235 %X The popularity of federated learning comes from the promise of better scalability and the ability for participants to keep control of their data, improving data security and sovereignty. Unfortunately, sharing model updates also creates a new privacy attack surface. In this work, we characterize the privacy guarantees of decentralized learning with random walk algorithms, where a model is updated by traveling from one node to another along the edges of a communication graph. Using a recent variant of differential privacy tailored to the study of decentralized algorithms, namely Pairwise Network Differential Privacy, we derive closed-form expressions for the privacy loss between each pair of nodes, where the impact of the communication topology is captured by graph-theoretic quantities. Our results further reveal that random walk algorithms tend to yield better privacy guarantees than gossip algorithms for nodes close to each other. We supplement our theoretical results with empirical evaluation on synthetic and real-world graphs and datasets.
APA
Cyffers, E., Bellet, A. &amp; Upadhyay, J. (2024). Differentially Private Decentralized Learning with Random Walks. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:9762-9783. Available from https://proceedings.mlr.press/v235/cyffers24a.html.