Federated Learning Algorithm based on Gaussian Local Differential Noise
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:325-339, 2024.
Abstract
In differential privacy-based federated learning, the data of different clients are often not independently and identically distributed. During model training, each client's data optimizes and converges towards its own local optimum, causing a client drift phenomenon that reduces accuracy and makes it difficult to obtain the optimal global model. To address this issue, a federated learning algorithm based on local differential privacy is proposed. Each client is assigned its own control variable c_i to control the direction of its model updates, and a global control variable c is maintained on the server side. The SCAFFOLD algorithm is used to aggregate all client model parameters and control variables. During model training, a correction term c − c_i is added when updating parameters on the client side, and the training bias is adjusted according to the global control variable obtained from the server in the previous round, thereby steering the model's iterations towards the global optimum. Experimental results on the CIFAR-10 dataset demonstrate the effectiveness of the new algorithm.