DoCoFL: Downlink Compression for Cross-Device Federated Learning

Ron Dorfman, Shay Vargaftik, Yaniv Ben-Itzhak, Kfir Yehuda Levy
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:8356-8388, 2023.

Abstract

Many compression techniques have been proposed to reduce the communication overhead of Federated Learning training procedures. However, these are typically designed for compressing model updates, which are expected to decay throughout training. As a result, such methods are inapplicable to downlink (i.e., from the parameter server to clients) compression in the cross-device setting, where heterogeneous clients may appear only once during training and thus must download the model parameters. Accordingly, we propose DoCoFL, a new framework for downlink compression in the cross-device setting. Importantly, DoCoFL can be seamlessly combined with many uplink compression schemes, rendering it suitable for bi-directional compression. Through extensive evaluation, we show that DoCoFL offers significant bi-directional bandwidth reduction while achieving accuracy competitive with that of a baseline without any compression.
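
To make the setting concrete, below is a minimal Python sketch of one bi-directional round under the framing the abstract describes: clients download compressed model parameters (rather than updates) and upload compressed updates. This is illustrative only; the uniform quantizer and the client.local_train API are assumptions made for the example, not DoCoFL's actual compression scheme, which the paper develops in detail.

    import numpy as np

    def quantize(x, bits=4):
        # Illustrative stochastic uniform quantizer; a stand-in for any
        # compression operator, not the one proposed in the paper.
        scale = np.max(np.abs(x)) + 1e-12
        levels = 2 ** (bits - 1) - 1
        y = x / scale * levels
        low = np.floor(y)
        q = low + (np.random.rand(*x.shape) < (y - low))  # unbiased rounding
        return q / levels * scale

    def federated_round(server_params, clients, lr=1.0):
        # Downlink: each client downloads compressed *parameters*, not an
        # update, so a client that appears only once can still participate.
        downlink = quantize(server_params)
        # Uplink: each client compresses its locally computed update.
        # client.local_train is a hypothetical API returning an update vector.
        updates = [quantize(c.local_train(downlink)) for c in clients]
        return server_params - lr * np.mean(updates, axis=0)

The key point the sketch captures is that the downlink compresses full parameters, whose magnitude does not decay over training, which is why update-oriented compressors are a poor fit for this direction.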

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-dorfman23a,
  title     = {{D}o{C}o{FL}: Downlink Compression for Cross-Device Federated Learning},
  author    = {Dorfman, Ron and Vargaftik, Shay and Ben-Itzhak, Yaniv and Levy, Kfir Yehuda},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {8356--8388},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/dorfman23a/dorfman23a.pdf},
  url       = {https://proceedings.mlr.press/v202/dorfman23a.html}
}
Endnote
%0 Conference Paper
%T DoCoFL: Downlink Compression for Cross-Device Federated Learning
%A Ron Dorfman
%A Shay Vargaftik
%A Yaniv Ben-Itzhak
%A Kfir Yehuda Levy
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-dorfman23a
%I PMLR
%P 8356--8388
%U https://proceedings.mlr.press/v202/dorfman23a.html
%V 202
APA
Dorfman, R., Vargaftik, S., Ben-Itzhak, Y., & Levy, K. Y. (2023). DoCoFL: Downlink Compression for Cross-Device Federated Learning. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:8356-8388. Available from https://proceedings.mlr.press/v202/dorfman23a.html.