FedPerf: A Practitioners’ Guide to Performance of Federated Learning Algorithms

Ajinkya Mulay, Baye Gaspard, Rakshit Naidu, Santiago Gonzalez-Toral, Vineeth S, Tushar Semwal, Ayush Manish Agrawal
NeurIPS 2020 Workshop on Pre-registration in Machine Learning, PMLR 148:302-324, 2021.

Abstract

Federated Learning (FL) enables edge devices to collaboratively train a global model without sharing their local data. This decentralized, distributed approach improves user privacy, security, and trust. Different variants of FL algorithms have presented promising results on both IID and skewed non-IID data. However, the performance of FL algorithms has been found to be sensitive to the FL system parameters and the hyperparameters of the underlying model, and in practice, tuning the right set of parameter settings for an FL algorithm is an expensive task. In this pre-registered paper, we propose an empirical investigation of four prominent FL algorithms to discover the relationship between the FL System Parameters (FLSPs) and their performance. The FLSPs add extra complexity to FL algorithms over a traditional ML system. We hypothesize that choosing the best FL algorithm for a given set of FLSPs is not a trivial problem. Further, we endeavor to formulate a systematic method that could aid practitioners in selecting a suitable algorithm given the FLSPs. The code for all the experiments is available here: https://github.com/tushar-semwal/fedperf.
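To make the collaborative-training setup concrete, the following is a minimal, illustrative sketch of one federated averaging (FedAvg-style) aggregation round, the canonical baseline among the FL algorithms the paper studies. The function name and the toy client data are hypothetical; in a real FL system each client would train its parameters locally on-device before the server aggregates them.

```python
# Minimal sketch of one FedAvg-style aggregation round (illustrative only).
# Each client contributes a parameter vector; the server averages them,
# weighted by local dataset size, without ever seeing the raw data.

def fedavg_round(client_weights, client_sizes):
    """Aggregate client parameter vectors, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            global_weights[i] += (n / total) * w[i]
    return global_weights

# Three hypothetical clients with 2-parameter models and unequal dataset sizes.
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 20, 70]
print(fedavg_round(clients, sizes))  # size-weighted average of the clients
```

Note how the client with 70 samples dominates the aggregate; skewed non-IID data and system parameters such as the number of participating clients per round are exactly the kind of FLSPs whose effect the paper sets out to measure.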

Cite this Paper


BibTeX
@InProceedings{pmlr-v148-mulay21a,
  title     = {FedPerf: A Practitioners’ Guide to Performance of Federated Learning Algorithms},
  author    = {Mulay, Ajinkya and Gaspard, Baye and Naidu, Rakshit and Gonzalez-Toral, Santiago and S, Vineeth and Semwal, Tushar and Manish Agrawal, Ayush},
  booktitle = {NeurIPS 2020 Workshop on Pre-registration in Machine Learning},
  pages     = {302--324},
  year      = {2021},
  editor    = {Bertinetto, Luca and Henriques, João F. and Albanie, Samuel and Paganini, Michela and Varol, Gül},
  volume    = {148},
  series    = {Proceedings of Machine Learning Research},
  month     = {11 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v148/mulay21a/mulay21a.pdf},
  url       = {https://proceedings.mlr.press/v148/mulay21a.html},
  abstract  = {Federated Learning (FL) enables edge devices to collaboratively train a global model without sharing their local data. This decentralized and distributed approach improves user privacy, security, and trust. Different variants of FL algorithms have presented promising results on both IID and skewed Non-IID data. However, the performance of FL algorithms is found to be sensitive to the FL system parameters and hyperparameters of the used model. In practice, tuning the right set of parameter settings for an FL algorithm is an expensive task. In this preregister paper, we propose an empirical investigation on four prominent FL algorithms to discover the relation between the FL System Parameters (FLSPs) and their performances. The FLSPs add extra complexity to FL algorithms over a traditional ML system. We hypothesize that choosing the best FL algorithm for the given FLSP is not a trivial problem. Further, we endeavor to formulate a systematic method that could aid the practitioners in selecting a suitable algorithm given the FLSPs. The code for all the experiments is available here: https://github.com/tushar-semwal/fedperf.}
}
Endnote
%0 Conference Paper
%T FedPerf: A Practitioners’ Guide to Performance of Federated Learning Algorithms
%A Ajinkya Mulay
%A Baye Gaspard
%A Rakshit Naidu
%A Santiago Gonzalez-Toral
%A Vineeth S
%A Tushar Semwal
%A Ayush Manish Agrawal
%B NeurIPS 2020 Workshop on Pre-registration in Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Luca Bertinetto
%E João F. Henriques
%E Samuel Albanie
%E Michela Paganini
%E Gül Varol
%F pmlr-v148-mulay21a
%I PMLR
%P 302--324
%U https://proceedings.mlr.press/v148/mulay21a.html
%V 148
%X Federated Learning (FL) enables edge devices to collaboratively train a global model without sharing their local data. This decentralized and distributed approach improves user privacy, security, and trust. Different variants of FL algorithms have presented promising results on both IID and skewed Non-IID data. However, the performance of FL algorithms is found to be sensitive to the FL system parameters and hyperparameters of the used model. In practice, tuning the right set of parameter settings for an FL algorithm is an expensive task. In this preregister paper, we propose an empirical investigation on four prominent FL algorithms to discover the relation between the FL System Parameters (FLSPs) and their performances. The FLSPs add extra complexity to FL algorithms over a traditional ML system. We hypothesize that choosing the best FL algorithm for the given FLSP is not a trivial problem. Further, we endeavor to formulate a systematic method that could aid the practitioners in selecting a suitable algorithm given the FLSPs. The code for all the experiments is available here: https://github.com/tushar-semwal/fedperf.
APA
Mulay, A., Gaspard, B., Naidu, R., Gonzalez-Toral, S., S, V., Semwal, T., & Manish Agrawal, A. (2021). FedPerf: A Practitioners’ Guide to Performance of Federated Learning Algorithms. NeurIPS 2020 Workshop on Pre-registration in Machine Learning, in Proceedings of Machine Learning Research, 148:302-324. Available from https://proceedings.mlr.press/v148/mulay21a.html.
