On the Power of Adaptive Weighted Aggregation in Heterogeneous Federated Learning and Beyond

Dun Zeng, Zenglin Xu, Shiyu Liu, Yu Pan, Qifan Wang, Xiaoying Tang
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1081-1089, 2025.

Abstract

Federated averaging (FedAvg) is the most fundamental algorithm in federated learning (FL). Previous theoretical results assert that FedAvg's convergence and generalization degrade under heterogeneous clients. However, recent empirical results show that FedAvg can perform well in many real-world heterogeneous tasks. These results reveal an inconsistency between FL theory and practice that is not fully explained. In this paper, through rigorous convergence analysis, we show that common heterogeneity measures contribute to this inconsistency. Furthermore, we introduce a new measure, client consensus dynamics, and prove that FedAvg can effectively handle client heterogeneity when an appropriate aggregation strategy is used. Building on this theoretical insight, we present a simple and effective FedAvg variant termed FedAWARE. Extensive experiments on three datasets and two modern neural network architectures demonstrate that FedAWARE achieves faster convergence and better generalization in heterogeneous client settings. Moreover, our results show that FedAWARE can significantly enhance the generalization performance of advanced FL algorithms when used as a plug-in module.
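For context, below is a minimal sketch (not taken from the paper) of the server-side weighted-aggregation step that FedAvg-style methods share. The function name weighted_aggregate and the sample-count weighting are illustrative assumptions; the paper's adaptive FedAWARE weighting rule is not reproduced here and can be found in the PDF.

import numpy as np

def weighted_aggregate(client_updates, weights):
    # Convex combination of client updates. Plain FedAvg weights
    # clients by local sample counts; adaptive variants (such as
    # the paper's FedAWARE) compute the weights differently.
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()      # normalize to sum to 1
    stacked = np.stack(client_updates)     # shape: (num_clients, dim)
    return weights @ stacked               # aggregated update

# Hypothetical usage: three clients with different data volumes.
sample_counts = [100, 300, 600]
updates = [np.random.randn(10) for _ in sample_counts]
global_update = weighted_aggregate(updates, sample_counts)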

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-zeng25b,
  title     = {On the Power of Adaptive Weighted Aggregation in Heterogeneous Federated Learning and Beyond},
  author    = {Zeng, Dun and Xu, Zenglin and Liu, Shiyu and Pan, Yu and Wang, Qifan and Tang, Xiaoying},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1081--1089},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/zeng25b/zeng25b.pdf},
  url       = {https://proceedings.mlr.press/v258/zeng25b.html},
  abstract  = {Federated averaging (FedAvg) is the most fundamental algorithm in federated learning (FL). Previous theoretical results assert that FedAvg's convergence and generalization degrade under heterogeneous clients. However, recent empirical results show that FedAvg can perform well in many real-world heterogeneous tasks. These results reveal an inconsistency between FL theory and practice that is not fully explained. In this paper, through rigorous convergence analysis, we show that common heterogeneity measures contribute to this inconsistency. Furthermore, we introduce a new measure, \textit{client consensus dynamics}, and prove that \textit{FedAvg can effectively handle client heterogeneity when an appropriate aggregation strategy is used}. Building on this theoretical insight, we present a simple and effective FedAvg variant termed FedAWARE. Extensive experiments on three datasets and two modern neural network architectures demonstrate that FedAWARE achieves faster convergence and better generalization in heterogeneous client settings. Moreover, our results show that FedAWARE can significantly enhance the generalization performance of advanced FL algorithms when used as a plug-in module.}
}
Endnote
%0 Conference Paper
%T On the Power of Adaptive Weighted Aggregation in Heterogeneous Federated Learning and Beyond
%A Dun Zeng
%A Zenglin Xu
%A Shiyu Liu
%A Yu Pan
%A Qifan Wang
%A Xiaoying Tang
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-zeng25b
%I PMLR
%P 1081--1089
%U https://proceedings.mlr.press/v258/zeng25b.html
%V 258
%X Federated averaging (FedAvg) is the most fundamental algorithm in federated learning (FL). Previous theoretical results assert that FedAvg's convergence and generalization degrade under heterogeneous clients. However, recent empirical results show that FedAvg can perform well in many real-world heterogeneous tasks. These results reveal an inconsistency between FL theory and practice that is not fully explained. In this paper, through rigorous convergence analysis, we show that common heterogeneity measures contribute to this inconsistency. Furthermore, we introduce a new measure, client consensus dynamics, and prove that FedAvg can effectively handle client heterogeneity when an appropriate aggregation strategy is used. Building on this theoretical insight, we present a simple and effective FedAvg variant termed FedAWARE. Extensive experiments on three datasets and two modern neural network architectures demonstrate that FedAWARE achieves faster convergence and better generalization in heterogeneous client settings. Moreover, our results show that FedAWARE can significantly enhance the generalization performance of advanced FL algorithms when used as a plug-in module.
APA
Zeng, D., Xu, Z., Liu, S., Pan, Y., Wang, Q. & Tang, X. (2025). On the Power of Adaptive Weighted Aggregation in Heterogeneous Federated Learning and Beyond. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1081-1089. Available from https://proceedings.mlr.press/v258/zeng25b.html.
