Constrained differentially private federated learning for low-bandwidth devices

Raouf Kerkouche, Gergely Ács, Claude Castelluccia, Pierre Genevès
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:1756-1765, 2021.

Abstract

Federated learning has become a prominent approach when different entities want to collaboratively learn a common model without sharing their training data. However, federated learning has two main drawbacks. First, it is quite bandwidth-inefficient, as it involves many message exchanges between the aggregating server and the participating entities. This bandwidth usage and the corresponding processing costs can be prohibitive if the participating entities are, for example, mobile devices. Second, although federated learning improves privacy by not sharing data, recent attacks have shown that it still leaks information about the training data. This paper presents a novel privacy-preserving federated learning scheme. The proposed scheme provides theoretical privacy guarantees, as it is based on Differential Privacy. Furthermore, it optimizes model accuracy by constraining the learning phase to a few selected weights. Finally, as shown experimentally, it reduces the upstream and downstream bandwidth by up to 99.9% compared to standard federated learning, making it practical for mobile systems.
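The general recipe the abstract describes, restricting each client's update to a small shared subset of weights and then applying Gaussian-mechanism differential privacy before upload, can be sketched as follows. This is a hypothetical illustration, not the paper's actual algorithm; the function name, mask construction, and parameter values (`clip_norm`, `noise_multiplier`) are all assumptions chosen for clarity.

```python
import numpy as np

def private_sparse_update(update, mask, clip_norm=1.0, noise_multiplier=1.1,
                          rng=None):
    """Constrain a client's model update to `mask`, clip it, and add noise.

    Illustrative only: a real scheme would calibrate the noise to a target
    (epsilon, delta) budget and agree on the mask across clients.
    """
    rng = rng or np.random.default_rng(0)
    sparse = update * mask                          # keep only the selected weights
    norm = np.linalg.norm(sparse)
    sparse *= min(1.0, clip_norm / (norm + 1e-12))  # L2 clipping bounds sensitivity
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=sparse.shape)
    return sparse + noise * mask                    # noise only where weights are sent

# Toy example: a 10-weight model where only 3 coordinates are communicated,
# so only 3 of 10 values need to be uploaded.
update = np.ones(10)
mask = np.zeros(10)
mask[[1, 4, 7]] = 1.0
noisy = private_sparse_update(update, mask)
```

Because both the clipped update and the noise are zero outside the mask, the client only transmits the masked coordinates, which is where the bandwidth saving in the abstract comes from.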

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-kerkouche21a,
  title     = {Constrained differentially private federated learning for low-bandwidth devices},
  author    = {Kerkouche, Raouf and \'Acs, Gergely and Castelluccia, Claude and Genev\`es, Pierre},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages     = {1756--1765},
  year      = {2021},
  editor    = {de Campos, Cassio and Maathuis, Marloes H.},
  volume    = {161},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v161/kerkouche21a/kerkouche21a.pdf},
  url       = {https://proceedings.mlr.press/v161/kerkouche21a.html},
  abstract  = {Federated learning has become a prominent approach when different entities want to collaboratively learn a common model without sharing their training data. However, federated learning has two main drawbacks. First, it is quite bandwidth-inefficient, as it involves many message exchanges between the aggregating server and the participating entities. This bandwidth usage and the corresponding processing costs can be prohibitive if the participating entities are, for example, mobile devices. Second, although federated learning improves privacy by not sharing data, recent attacks have shown that it still leaks information about the training data. This paper presents a novel privacy-preserving federated learning scheme. The proposed scheme provides theoretical privacy guarantees, as it is based on Differential Privacy. Furthermore, it optimizes model accuracy by constraining the learning phase to a few selected weights. Finally, as shown experimentally, it reduces the upstream and downstream bandwidth by up to 99.9% compared to standard federated learning, making it practical for mobile systems.}
}
Endnote
%0 Conference Paper
%T Constrained differentially private federated learning for low-bandwidth devices
%A Raouf Kerkouche
%A Gergely Ács
%A Claude Castelluccia
%A Pierre Genevès
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-kerkouche21a
%I PMLR
%P 1756--1765
%U https://proceedings.mlr.press/v161/kerkouche21a.html
%V 161
%X Federated learning has become a prominent approach when different entities want to collaboratively learn a common model without sharing their training data. However, federated learning has two main drawbacks. First, it is quite bandwidth-inefficient, as it involves many message exchanges between the aggregating server and the participating entities. This bandwidth usage and the corresponding processing costs can be prohibitive if the participating entities are, for example, mobile devices. Second, although federated learning improves privacy by not sharing data, recent attacks have shown that it still leaks information about the training data. This paper presents a novel privacy-preserving federated learning scheme. The proposed scheme provides theoretical privacy guarantees, as it is based on Differential Privacy. Furthermore, it optimizes model accuracy by constraining the learning phase to a few selected weights. Finally, as shown experimentally, it reduces the upstream and downstream bandwidth by up to 99.9% compared to standard federated learning, making it practical for mobile systems.
APA
Kerkouche, R., Ács, G., Castelluccia, C. &amp; Genevès, P. (2021). Constrained differentially private federated learning for low-bandwidth devices. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:1756-1765. Available from https://proceedings.mlr.press/v161/kerkouche21a.html.

Related Material