Federated Learning Algorithm based on Gaussian Local Differential Noise

Wu Fan, Gao Maoting
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:325-339, 2024.

Abstract

In differential privacy-based federated learning, the data of different clients are often not independently and identically distributed. During model training, each client's model therefore optimizes and converges towards its own local optimum, causing the client drift phenomenon, which reduces accuracy and makes it difficult to obtain the optimal global model. To address this issue, a federated learning algorithm based on local differential privacy is proposed. Each client is assigned its own control variable c_i to control the model update direction, and a global control variable c is maintained on the server side. The SCAFFOLD algorithm is used to aggregate all client model parameters and control variables. During local training, a correction term c - c_i is added to each client's parameter update, adjusting the training bias according to the global control variable received from the server in the previous round and thereby steering the model's iterations towards the global optimum. Experimental results on the CIFAR-10 dataset demonstrate the effectiveness of the new algorithm.
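The mechanism the abstract describes, SCAFFOLD-style control variates with Gaussian noise added for local differential privacy, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all function names, the noise placement, and the control-variate update rule (option II from the SCAFFOLD paper) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_update(x_global, c, c_i, grad_fn, lr=0.1, local_steps=5, sigma=0.01):
    """SCAFFOLD-style local update with Gaussian noise (illustrative sketch).

    x_global: global model parameters received from the server
    c, c_i:   global and per-client control variables
    grad_fn:  this client's local gradient function (assumed)
    sigma:    std of the Gaussian noise added for local differential privacy
    """
    x = x_global.copy()
    for _ in range(local_steps):
        g = grad_fn(x)
        # correction term (c - c_i) steers the update toward the global optimum
        x = x - lr * (g + c - c_i)
    # refresh the client control variable from the realized local progress
    c_i_new = c_i - c + (x_global - x) / (local_steps * lr)
    # Gaussian mechanism (assumed placement): perturb the model delta
    # before it leaves the client
    delta = (x - x_global) + rng.normal(0.0, sigma, size=x.shape)
    return delta, c_i_new

def server_aggregate(x_global, c, deltas, c_i_news, c_i_olds):
    """Average client deltas and control-variable updates (sketch)."""
    x_new = x_global + np.mean(deltas, axis=0)
    c_new = c + np.mean([cn - co for cn, co in zip(c_i_news, c_i_olds)],
                        axis=0)
    return x_new, c_new
```

With sigma > 0 the added noise trades accuracy for privacy; the correction term c - c_i is what counteracts the drift of each client toward its own local optimum.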

Cite this Paper


BibTeX
@InProceedings{pmlr-v245-fan24a,
  title =     {Federated Learning Algorithm based on Gaussian Local Differential Noise},
  author =    {Fan, Wu and Maoting, Gao},
  booktitle = {Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing},
  pages =     {325--339},
  year =      {2024},
  editor =    {Nianyin, Zeng and Pachori, Ram Bilas},
  volume =    {245},
  series =    {Proceedings of Machine Learning Research},
  month =     {26--28 Apr},
  publisher = {PMLR},
  pdf =       {https://raw.githubusercontent.com/mlresearch/v245/main/assets/fan24a/fan24a.pdf},
  url =       {https://proceedings.mlr.press/v245/fan24a.html},
  abstract =  {In differential privacy-based federated learning, the data of different clients are often not independently and identically distributed. During model training, each client's model therefore optimizes and converges towards its own local optimum, causing the client drift phenomenon, which reduces accuracy and makes it difficult to obtain the optimal global model. To address this issue, a federated learning algorithm based on local differential privacy is proposed. Each client is assigned its own control variable c_i to control the model update direction, and a global control variable c is maintained on the server side. The SCAFFOLD algorithm is used to aggregate all client model parameters and control variables. During local training, a correction term c - c_i is added to each client's parameter update, adjusting the training bias according to the global control variable received from the server in the previous round and thereby steering the model's iterations towards the global optimum. Experimental results on the CIFAR-10 dataset demonstrate the effectiveness of the new algorithm.}
}
Endnote
%0 Conference Paper
%T Federated Learning Algorithm based on Gaussian Local Differential Noise
%A Wu Fan
%A Gao Maoting
%B Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing
%C Proceedings of Machine Learning Research
%D 2024
%E Zeng Nianyin
%E Ram Bilas Pachori
%F pmlr-v245-fan24a
%I PMLR
%P 325--339
%U https://proceedings.mlr.press/v245/fan24a.html
%V 245
%X In differential privacy-based federated learning, the data of different clients are often not independently and identically distributed. During model training, each client's model therefore optimizes and converges towards its own local optimum, causing the client drift phenomenon, which reduces accuracy and makes it difficult to obtain the optimal global model. To address this issue, a federated learning algorithm based on local differential privacy is proposed. Each client is assigned its own control variable c_i to control the model update direction, and a global control variable c is maintained on the server side. The SCAFFOLD algorithm is used to aggregate all client model parameters and control variables. During local training, a correction term c - c_i is added to each client's parameter update, adjusting the training bias according to the global control variable received from the server in the previous round and thereby steering the model's iterations towards the global optimum. Experimental results on the CIFAR-10 dataset demonstrate the effectiveness of the new algorithm.
APA
Fan, W. &amp; Maoting, G. (2024). Federated Learning Algorithm based on Gaussian Local Differential Noise. Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, in Proceedings of Machine Learning Research 245:325-339. Available from https://proceedings.mlr.press/v245/fan24a.html.
