FedBAT: Communication-Efficient Federated Learning via Learnable Binarization

Shiwei Li, Wenchao Xu, Haozhao Wang, Xing Tang, Yining Qi, Shijie Xu, Weihong Luo, Yuhua Li, Xiuqiang He, Ruixuan Li
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:29074-29095, 2024.

Abstract

Federated learning is a promising distributed machine learning paradigm that can effectively exploit large-scale data without exposing users’ privacy. However, it may incur significant communication overhead, thereby potentially impairing the training efficiency. To address this challenge, numerous studies suggest binarizing the model updates. Nonetheless, traditional methods usually binarize model updates in a post-training manner, resulting in significant approximation errors and consequent degradation in model accuracy. To this end, we propose Federated Binarization-Aware Training (FedBAT), a novel framework that directly learns binary model updates during the local training process, thus inherently reducing the approximation errors. FedBAT incorporates an innovative binarization operator, along with meticulously designed derivatives to facilitate efficient learning. In addition, we establish theoretical guarantees regarding the convergence of FedBAT. Extensive experiments are conducted on four popular datasets. The results show that FedBAT significantly accelerates the convergence and exceeds the accuracy of baselines by up to 9%, even surpassing that of FedAvg in some cases.
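
To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of binarization-aware training of a model update: the forward pass binarizes the update to a per-tensor scale times its sign pattern, and the backward pass uses a straight-through gradient. The operator name (BinarizeSTE), the mean-absolute-value scale, and the identity backward pass are illustrative assumptions; FedBAT's actual binarization operator and its meticulously designed derivatives are specified in the paper, not here.

    # Hypothetical sketch: learning a binary model update through a
    # straight-through estimator (STE). This is NOT FedBAT's exact operator;
    # it only illustrates training *through* binarization during local
    # optimization, rather than binarizing post-training.
    import torch

    class BinarizeSTE(torch.autograd.Function):
        @staticmethod
        def forward(ctx, delta):
            # Binary update: per-tensor scale times the sign pattern.
            scale = delta.abs().mean()
            return scale * torch.sign(delta)

        @staticmethod
        def backward(ctx, grad_output):
            # Straight-through: treat binarization as identity in backward.
            return grad_output

    torch.manual_seed(0)
    w_global = torch.randn(8)                   # frozen global weights
    delta = torch.zeros(8, requires_grad=True)  # learnable local update
    target = torch.randn(8)                     # stand-in for a local objective

    opt = torch.optim.SGD([delta], lr=0.1)
    for _ in range(100):
        w_eff = w_global + BinarizeSTE.apply(delta)  # train through binarization
        loss = torch.nn.functional.mse_loss(w_eff, target)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Only the sign pattern (1 bit per parameter) plus one scale per tensor
    # would need to be communicated to the server.
    binary_update = torch.sign(delta.detach())

Because the loss is always computed through the binarized update, the client learns an update that survives binarization, which is the intuition behind reducing post-training approximation error; the one-bit sign pattern is also where the communication saving comes from.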

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-li24ca,
  title     = {{F}ed{BAT}: Communication-Efficient Federated Learning via Learnable Binarization},
  author    = {Li, Shiwei and Xu, Wenchao and Wang, Haozhao and Tang, Xing and Qi, Yining and Xu, Shijie and Luo, Weihong and Li, Yuhua and He, Xiuqiang and Li, Ruixuan},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {29074--29095},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/li24ca/li24ca.pdf},
  url       = {https://proceedings.mlr.press/v235/li24ca.html}
}
Endnote
%0 Conference Paper
%T FedBAT: Communication-Efficient Federated Learning via Learnable Binarization
%A Shiwei Li
%A Wenchao Xu
%A Haozhao Wang
%A Xing Tang
%A Yining Qi
%A Shijie Xu
%A Weihong Luo
%A Yuhua Li
%A Xiuqiang He
%A Ruixuan Li
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-li24ca
%I PMLR
%P 29074--29095
%U https://proceedings.mlr.press/v235/li24ca.html
%V 235
APA
Li, S., Xu, W., Wang, H., Tang, X., Qi, Y., Xu, S., Luo, W., Li, Y., He, X. & Li, R. (2024). FedBAT: Communication-Efficient Federated Learning via Learnable Binarization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:29074-29095. Available from https://proceedings.mlr.press/v235/li24ca.html.
