Decentralized Federated Learning Algorithm Based on Federated Groups and Secure Multiparty Computation

Wu Fan, Gao Maoting
Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, PMLR 245:340-348, 2024.

Abstract

Centralized, privacy-preserving federated learning relies on a trusted central server, offers weak resistance to malicious attacks, and is prone to privacy leakage. To address these problems, this paper proposes a decentralized federated learning algorithm based on federated groups and secure multiparty computation. A federated group mechanism based on model relevance gives each client its own federated group; model parameters are transmitted only among federated group members, so clients outside the group cannot access the parameter information. A secret owner uses a secret sharing algorithm to split its model parameters into several secret shares, which are transmitted to the federated group members over secure channels. Each group member aggregates the shares it receives by weighted averaging; the secret owner then collects the aggregated shares returned by all group members, recovers the secret with the secret recovery algorithm, and obtains the updated model parameters. When a group member becomes a Byzantine node, it is removed from the federated group and another client is selected to join the group based on model relevance. Thus each client participating in federated learning acts as both a data node and a computing node, eliminating reliance on a central server and achieving decentralization. The privacy guarantees of the proposed algorithm are analyzed theoretically, and experiments on the FedML platform show that it offers stronger resistance to attacks.
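
The following is a minimal sketch, in Python/NumPy, of the in-group secure aggregation the abstract describes. The choice of additive (sum-to-the-secret) sharing, the helper names split_secret and recover_secret, and the example weights are illustrative assumptions, not details taken from the paper.

# Minimal sketch of in-group secure aggregation via additive secret sharing.
# Assumptions not taken from the paper: additive (sum-to-the-secret) sharing,
# real-valued shares, and the particular aggregation weights used below.
import numpy as np

rng = np.random.default_rng(0)


def split_secret(params: np.ndarray, n_shares: int) -> list[np.ndarray]:
    """Split a parameter vector into n additive shares that sum back to it."""
    shares = [rng.standard_normal(params.shape) for _ in range(n_shares - 1)]
    shares.append(params - sum(shares))
    return shares


def recover_secret(shares: list[np.ndarray]) -> np.ndarray:
    """Recover the secret (here, the aggregated model) by summing the shares."""
    return sum(shares)


# Example federated group: 3 clients, each holding a 4-dimensional model.
weights = [0.5, 0.3, 0.2]                       # aggregation weights, sum to 1
models = [rng.standard_normal(4) for _ in weights]
n = len(models)

# 1) Every client splits its parameters; share j is sent to group member j.
shares = [split_secret(w, n) for w in models]   # shares[i][j] goes to member j

# 2) Member j aggregates the shares it received by weighted averaging.
aggregated = [sum(weights[i] * shares[i][j] for i in range(n)) for j in range(n)]

# 3) The secret owner recovers the aggregated model from the returned shares.
updated_model = recover_secret(aggregated)
assert np.allclose(updated_model, sum(a * w for a, w in zip(weights, models)))

Summing the per-member aggregates reproduces the weighted average of every client's parameters, while no single group member ever sees another client's full parameter vector.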

Cite this Paper


BibTeX
@InProceedings{pmlr-v245-fan24b,
  title     = {Decentralized Federated Learning Algorithm Based on Federated Groups and Secure Multiparty Computation},
  author    = {Fan, Wu and Maoting, Gao},
  booktitle = {Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing},
  pages     = {340--348},
  year      = {2024},
  editor    = {Nianyin, Zeng and Pachori, Ram Bilas},
  volume    = {245},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Apr},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v245/main/assets/fan24b/fan24b.pdf},
  url       = {https://proceedings.mlr.press/v245/fan24b.html},
  abstract  = {Centralized, privacy-preserving federated learning relies on a trusted central server, offers weak resistance to malicious attacks, and is prone to privacy leakage. To address these problems, this paper proposes a decentralized federated learning algorithm based on federated groups and secure multiparty computation. A federated group mechanism based on model relevance gives each client its own federated group; model parameters are transmitted only among federated group members, so clients outside the group cannot access the parameter information. A secret owner uses a secret sharing algorithm to split its model parameters into several secret shares, which are transmitted to the federated group members over secure channels. Each group member aggregates the shares it receives by weighted averaging; the secret owner then collects the aggregated shares returned by all group members, recovers the secret with the secret recovery algorithm, and obtains the updated model parameters. When a group member becomes a Byzantine node, it is removed from the federated group and another client is selected to join the group based on model relevance. Thus each client participating in federated learning acts as both a data node and a computing node, eliminating reliance on a central server and achieving decentralization. The privacy guarantees of the proposed algorithm are analyzed theoretically, and experiments on the FedML platform show that it offers stronger resistance to attacks.}
}
Endnote
%0 Conference Paper
%T Decentralized Federated Learning Algorithm Based on Federated Groups and Secure Multiparty Computation
%A Wu Fan
%A Gao Maoting
%B Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing
%C Proceedings of Machine Learning Research
%D 2024
%E Zeng Nianyin
%E Ram Bilas Pachori
%F pmlr-v245-fan24b
%I PMLR
%P 340--348
%U https://proceedings.mlr.press/v245/fan24b.html
%V 245
%X Centralized, privacy-preserving federated learning relies on a trusted central server, offers weak resistance to malicious attacks, and is prone to privacy leakage. To address these problems, this paper proposes a decentralized federated learning algorithm based on federated groups and secure multiparty computation. A federated group mechanism based on model relevance gives each client its own federated group; model parameters are transmitted only among federated group members, so clients outside the group cannot access the parameter information. A secret owner uses a secret sharing algorithm to split its model parameters into several secret shares, which are transmitted to the federated group members over secure channels. Each group member aggregates the shares it receives by weighted averaging; the secret owner then collects the aggregated shares returned by all group members, recovers the secret with the secret recovery algorithm, and obtains the updated model parameters. When a group member becomes a Byzantine node, it is removed from the federated group and another client is selected to join the group based on model relevance. Thus each client participating in federated learning acts as both a data node and a computing node, eliminating reliance on a central server and achieving decentralization. The privacy guarantees of the proposed algorithm are analyzed theoretically, and experiments on the FedML platform show that it offers stronger resistance to attacks.
APA
Fan, W. & Maoting, G. (2024). Decentralized Federated Learning Algorithm Based on Federated Groups and Secure Multiparty Computation. Proceedings of 2024 International Conference on Machine Learning and Intelligent Computing, in Proceedings of Machine Learning Research 245:340-348. Available from https://proceedings.mlr.press/v245/fan24b.html.
