Ranking-based Client Imitation Selection for Efficient Federated Learning

Chunlin Tian, Zhan Shi, Xinpeng Qin, Li Li, Cheng-Zhong Xu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:48211-48225, 2024.

Abstract

Federated Learning (FL) enables multiple devices to collaboratively train a shared model while ensuring data privacy. The selection of participating devices in each training round critically affects both the model performance and training efficiency, especially given the vast heterogeneity in training capabilities and data distribution across devices. To deal with these challenges, we introduce a novel device selection solution called FedRank, an end-to-end, ranking-based model that is pre-trained by imitation learning against state-of-the-art analytical approaches. It not only considers data and system heterogeneity at runtime but also adaptively and efficiently chooses the most suitable clients for model training. Specifically, FedRank views client selection in FL as a ranking problem and employs a pairwise training strategy for the smart selection process. Additionally, an imitation learning-based approach is designed to counteract the cold-start issues often seen in state-of-the-art learning-based approaches. Experimental results reveal that FedRank boosts model accuracy by 5.2% to 56.9%, accelerates training convergence by up to $2.01 \times$, and reduces energy consumption by up to 40.1%.
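The pairwise training strategy mentioned in the abstract can be illustrated with a minimal RankNet-style sketch: a scoring model assigns each client a scalar utility, pairs of clients are compared through a logistic loss, and selection picks the top-ranked clients. This is an illustrative sketch only, not the authors' implementation; all names and scores below are hypothetical.

```python
import math

def pairwise_rank_loss(score_i, score_j):
    """RankNet-style pairwise loss for "client i should outrank client j".
    P(i > j) = sigmoid(score_i - score_j); loss = -log P(i > j),
    written in the numerically equivalent form log(1 + exp(-(s_i - s_j)))."""
    return math.log(1.0 + math.exp(-(score_i - score_j)))

def rank_clients(scores):
    """Return client indices sorted by predicted utility, best first."""
    return sorted(range(len(scores)), key=lambda k: scores[k], reverse=True)

# Toy example: three clients with scalar utility scores from some ranking model.
scores = [0.9, 0.1, 0.5]
order = rank_clients(scores)  # [0, 2, 1]: client 0 is selected first
# Loss is small when the model already ranks the preferred client higher,
# and large when the preferred ordering is violated.
good = pairwise_rank_loss(0.9, 0.1)
bad = pairwise_rank_loss(0.1, 0.9)
```

In an FL round, the top-$k$ indices of `order` would be the selected participants; imitation learning (as described in the abstract) would supply the preferred pairwise orderings from an analytical expert policy during pre-training.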

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-tian24d,
  title     = {Ranking-based Client Imitation Selection for Efficient Federated Learning},
  author    = {Tian, Chunlin and Shi, Zhan and Qin, Xinpeng and Li, Li and Xu, Cheng-Zhong},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {48211--48225},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/tian24d/tian24d.pdf},
  url       = {https://proceedings.mlr.press/v235/tian24d.html},
  abstract  = {Federated Learning (FL) enables multiple devices to collaboratively train a shared model while ensuring data privacy. The selection of participating devices in each training round critically affects both the model performance and training efficiency, especially given the vast heterogeneity in training capabilities and data distribution across devices. To deal with these challenges, we introduce a novel device selection solution called FedRank, which is based on an end-to-end, ranking-based model that is pre-trained by imitation learning against state-of-the-art analytical approaches. It not only considers data and system heterogeneity at runtime but also adaptively and efficiently chooses the most suitable clients for model training. Specifically, FedRank views client selection in FL as a ranking problem and employs a pairwise training strategy for the smart selection process. Additionally, an imitation learning-based approach is designed to counteract the cold-start issues often seen in state-of-the-art learning-based approaches. Experimental results reveal that FedRank boosts model accuracy by 5.2% to 56.9%, accelerates the training convergence up to $2.01 \times$ and saves the energy consumption up to 40.1%.}
}
Endnote
%0 Conference Paper
%T Ranking-based Client Imitation Selection for Efficient Federated Learning
%A Chunlin Tian
%A Zhan Shi
%A Xinpeng Qin
%A Li Li
%A Cheng-Zhong Xu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-tian24d
%I PMLR
%P 48211--48225
%U https://proceedings.mlr.press/v235/tian24d.html
%V 235
%X Federated Learning (FL) enables multiple devices to collaboratively train a shared model while ensuring data privacy. The selection of participating devices in each training round critically affects both the model performance and training efficiency, especially given the vast heterogeneity in training capabilities and data distribution across devices. To deal with these challenges, we introduce a novel device selection solution called FedRank, which is based on an end-to-end, ranking-based model that is pre-trained by imitation learning against state-of-the-art analytical approaches. It not only considers data and system heterogeneity at runtime but also adaptively and efficiently chooses the most suitable clients for model training. Specifically, FedRank views client selection in FL as a ranking problem and employs a pairwise training strategy for the smart selection process. Additionally, an imitation learning-based approach is designed to counteract the cold-start issues often seen in state-of-the-art learning-based approaches. Experimental results reveal that FedRank boosts model accuracy by 5.2% to 56.9%, accelerates the training convergence up to $2.01 \times$ and saves the energy consumption up to 40.1%.
APA
Tian, C., Shi, Z., Qin, X., Li, L. & Xu, C. (2024). Ranking-based Client Imitation Selection for Efficient Federated Learning. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:48211-48225. Available from https://proceedings.mlr.press/v235/tian24d.html.