Decentralized Federated Learning Algorithm Based on Federated Groups and Secure Multiparty Computation
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:340-348, 2024.
Abstract
To address the problems that privacy-preserving centralized federated learning relies on a trusted central server, offers weak resistance to malicious attacks, and is prone to privacy leakage, this paper proposes a decentralized federated learning algorithm based on federated groups and secure multiparty computation. A federated group mechanism based on model relevance gives each client its own federated group; model parameters are transmitted only among federated group members, so clients outside the group cannot access the parameter information. A secret owner uses a secret sharing algorithm to split its model parameters into several secret shares, which are transmitted to the federated group members through secure channels. Each federated group member aggregates the secret shares it receives by weighted averaging; the secret owner then collects the aggregated secret shares passed back from all federated group members, applies the secret recovery algorithm to recover the secret, and obtains the updated model parameters. When a group member becomes a Byzantine node, it is removed from the federated group and another client is selected to join the group based on model relevance. In this way, each client participating in federated learning serves as both a data node and a computing node, so federated learning eliminates the reliance on servers and achieves decentralization. The privacy properties of the proposed algorithm are analyzed theoretically, and experiments on the FedML platform demonstrate that the algorithm has stronger resistance to attacks.
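As a rough illustration of the share-based aggregation described above, the sketch below uses simple additive secret sharing over a toy group of three clients. The splitting scheme, the weights, and all function names (`split_secret`, `recover_secret`) are assumptions made for illustration and are not the paper's exact construction.

```python
import numpy as np

def split_secret(params, n_shares, rng=None):
    """Additively split a parameter vector into n_shares random shares
    that sum back to the original vector (one common secret-sharing
    choice; the paper's exact scheme may differ)."""
    rng = rng or np.random.default_rng()
    shares = [rng.normal(size=params.shape) for _ in range(n_shares - 1)]
    shares.append(params - sum(shares))
    return shares

def recover_secret(shares):
    """Recover the secret by summing all shares."""
    return sum(shares)

# --- illustrative round within one federated group of 3 clients ---
rng = np.random.default_rng(0)
weights = np.array([0.5, 0.3, 0.2])              # assumed aggregation weights
models = [rng.normal(size=4) for _ in range(3)]  # each client's local parameters

# 1. Every client splits its parameters into one share per group member.
all_shares = [split_secret(m, n_shares=3, rng=rng) for m in models]

# 2. Group member j receives the j-th share from every client and
#    aggregates those shares by weighted averaging, so no member
#    ever sees a complete model.
aggregated_shares = [
    sum(w * all_shares[i][j] for i, w in enumerate(weights))
    for j in range(3)
]

# 3. The secret owner collects the aggregated shares passed back by the
#    group members and recovers the weighted-average global model.
global_model = recover_secret(aggregated_shares)
assert np.allclose(global_model, sum(w * m for w, m in zip(weights, models)))
```

The recovered `global_model` equals the weighted average of the local models even though each group member only ever handled random-looking shares, which is the property the protocol relies on for privacy.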