FedBoost: A Communication-Efficient Algorithm for Federated Learning

Jenny Hamer, Mehryar Mohri, Ananda Theertha Suresh
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3973-3983, 2020.

Abstract

Communication cost is often a bottleneck in federated learning and other client-based distributed learning scenarios. To overcome this, several gradient compression and model compression algorithms have been proposed. In this work, we propose an alternative approach whereby an ensemble of pre-trained base predictors is trained via federated learning. This method allows a model that might otherwise exceed the communication bandwidth and storage capacity of the clients to be learned from on-device data through federated learning. Motivated by language modeling, we prove the optimality of ensemble methods for density estimation under both standard empirical risk minimization and agnostic risk minimization. We provide communication-efficient ensemble algorithms for federated learning in which the per-round communication cost is independent of the size of the ensemble. Furthermore, unlike work on gradient compression, our proposed approach reduces the communication cost of both server-to-client and client-to-server communication.
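To make the idea concrete, below is a minimal, hypothetical Python/NumPy sketch of how an ensemble's mixture weights could be trained federatedly while keeping per-round communication independent of the ensemble size: each round the server samples only a small subset of the pre-trained predictors and a small cohort of clients, the clients return gradients of a local log-loss only for the sampled predictors, and the server updates and re-normalizes the weights. The function name, the uniform subsampling, the gradient step, and the simplex projection are illustrative assumptions, not the paper's exact FedBoost algorithm.

import numpy as np

rng = np.random.default_rng(0)

def train_ensemble_weights(client_probs, rounds=200, subset_size=2,
                           clients_per_round=5, lr=0.01):
    # client_probs[c, k]: average probability that pre-trained base predictor k
    # assigns to client c's local data (a stand-in for on-device evaluation).
    num_clients, K = client_probs.shape
    alpha = np.ones(K) / K                       # mixture weights on the simplex

    for _ in range(rounds):
        # Server samples a small subset of predictors and clients; only the
        # sampled predictors are communicated downstream, so the message size
        # does not grow with the ensemble size K.
        S = rng.choice(K, size=subset_size, replace=False)
        C = rng.choice(num_clients, size=clients_per_round, replace=False)

        grad = np.zeros(subset_size)
        for c in C:
            # Each sampled client evaluates only the sampled predictors and
            # reports the gradient of its local negative log-likelihood with
            # respect to their weights, so upstream traffic is also independent of K.
            p = client_probs[c, S]
            mix = float(alpha[S] @ p) + 1e-12
            grad += -p / mix
        grad /= clients_per_round

        # Server-side update of the sampled coordinates (step size chosen
        # arbitrarily for illustration), then projection back onto the simplex.
        alpha[S] -= lr * grad
        alpha = np.clip(alpha, 1e-12, None)
        alpha /= alpha.sum()
    return alpha

For example, with synthetic per-client likelihoods probs = rng.uniform(0.01, 1.0, size=(10, 50)), train_ensemble_weights(probs) returns a weight vector over the 50 base predictors while each round communicates only subset_size of them in either direction.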

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-hamer20a,
  title     = {{F}ed{B}oost: A Communication-Efficient Algorithm for Federated Learning},
  author    = {Hamer, Jenny and Mohri, Mehryar and Suresh, Ananda Theertha},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3973--3983},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/hamer20a/hamer20a.pdf},
  url       = {https://proceedings.mlr.press/v119/hamer20a.html}
}
Endnote
%0 Conference Paper
%T FedBoost: A Communication-Efficient Algorithm for Federated Learning
%A Jenny Hamer
%A Mehryar Mohri
%A Ananda Theertha Suresh
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-hamer20a
%I PMLR
%P 3973--3983
%U https://proceedings.mlr.press/v119/hamer20a.html
%V 119
APA
Hamer, J., Mohri, M., & Suresh, A. T. (2020). FedBoost: A Communication-Efficient Algorithm for Federated Learning. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3973-3983. Available from https://proceedings.mlr.press/v119/hamer20a.html.
