FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning

Elnur Gasanov, Ahmed Khaled, Samuel Horváth, Peter Richtárik
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:11374-11421, 2022.

Abstract

Federated Learning (FL) is an increasingly popular machine learning paradigm in which multiple nodes collaborate to learn under privacy, communication, and heterogeneity constraints. A persistent problem in federated learning is that it is not clear what the optimization objective should be: the standard average risk minimization of supervised learning is inadequate for handling several major constraints specific to federated learning, such as communication adaptivity and personalization control. We identify several key desiderata in frameworks for federated learning and introduce a new framework, FLIX, that takes into account the unique challenges brought by federated learning. FLIX has a standard finite-sum form, which enables practitioners to tap into the immense wealth of existing (potentially non-local) methods for distributed optimization. Through a smart initialization that does not require any communication, FLIX does not require the use of local steps but is still provably capable of performing dissimilarity regularization on par with local methods. We give several algorithms for solving the FLIX formulation efficiently under communication constraints. Finally, we corroborate our theoretical results with extensive experimentation.
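
Concretely, FLIX proceeds in two phases. First, each device i computes the minimizer x_i^* of its own local loss f_i; this happens entirely locally and constitutes the communication-free initialization mentioned above. Then all devices jointly solve the ordinary finite-sum problem min_x (1/n) sum_i f_i(alpha_i x + (1 - alpha_i) x_i^*), where alpha_i in [0, 1] is a per-device personalization weight (alpha_i = 0 recovers the purely local model, alpha_i = 1 recovers standard empirical risk minimization), and device i deploys the mixed model alpha_i x + (1 - alpha_i) x_i^*. The following is a minimal NumPy sketch of this pipeline on synthetic quadratic losses; the data, personalization weights, and step size are illustrative choices, not values from the paper.

    import numpy as np

    # Minimal sketch of the FLIX formulation on synthetic quadratic local
    # losses f_i(x) = 0.5 * ||A_i x - b_i||^2. The problem sizes, alpha
    # weights, and step size below are illustrative, not from the paper.

    rng = np.random.default_rng(0)
    n, d = 10, 5                            # number of clients, model dimension
    A = rng.normal(size=(n, d, d))
    b = rng.normal(size=(n, d))
    alpha = rng.uniform(0.3, 0.9, size=n)   # per-client personalization weights

    # Phase 1 (no communication): each client computes its local optimum
    # x_i^* = argmin_x f_i(x), here in closed form via least squares.
    x_local = np.stack(
        [np.linalg.lstsq(A[i], b[i], rcond=None)[0] for i in range(n)]
    )

    def flix_grad(x):
        """Gradient of F(x) = (1/n) sum_i f_i(alpha_i x + (1 - alpha_i) x_i^*)."""
        g = np.zeros(d)
        for i in range(n):
            z = alpha[i] * x + (1 - alpha[i]) * x_local[i]  # client i's mixed model
            g += alpha[i] * A[i].T @ (A[i] @ z - b[i])      # chain rule: alpha_i * grad f_i(z)
        return g / n

    # Phase 2: solve the finite-sum problem; plain gradient descent here.
    x = np.zeros(d)
    for _ in range(2000):
        x -= 0.05 * flix_grad(x)

    # Personalized model deployed on client i:
    personalized = alpha[:, None] * x + (1 - alpha[:, None]) * x_local

Because the outer problem is a plain finite sum, the gradient descent loop above can be swapped for any existing distributed solver, including communication-compressed ones, which is precisely the flexibility the formulation is designed to provide.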

Cite this Paper

BibTeX
@InProceedings{pmlr-v151-gasanov22a,
  title     = {FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning},
  author    = {Gasanov, Elnur and Khaled, Ahmed and Horv\'ath, Samuel and Richt\'arik, Peter},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {11374--11421},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/gasanov22a/gasanov22a.pdf},
  url       = {https://proceedings.mlr.press/v151/gasanov22a.html}
}
Endnote
%0 Conference Paper
%T FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning
%A Elnur Gasanov
%A Ahmed Khaled
%A Samuel Horváth
%A Peter Richtárik
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-gasanov22a
%I PMLR
%P 11374--11421
%U https://proceedings.mlr.press/v151/gasanov22a.html
%V 151
APA
Gasanov, E., Khaled, A., Horváth, S. & Richtárik, P. (2022). FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:11374-11421. Available from https://proceedings.mlr.press/v151/gasanov22a.html.
