Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning

Hongyao Chen, Tianyang Xu, Xiaojun Wu, Josef Kittler
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:9500-9515, 2025.

Abstract

Batch Normalisation (BN) is widely used in conventional deep neural network training to harmonise the input-output distributions for each batch of data. However, federated learning, a distributed learning paradigm, faces the challenge of dealing with non-independent and identically distributed data among the client nodes. Due to the lack of a coherent methodology for updating BN statistical parameters, standard BN degrades the federated learning performance. To this end, it is urgent to explore an alternative normalisation solution for federated learning. In this work, we resolve the dilemma of the BN layer in federated learning by developing a customised normalisation approach, Hybrid Batch Normalisation (HBN). HBN separates the update of statistical parameters (i.e., means and variances used for evaluation) from that of learnable parameters (i.e., parameters that require gradient updates), obtaining unbiased estimates of global statistical parameters in distributed scenarios. In contrast with the existing solutions, we emphasise the supportive power of global statistics for federated learning. The HBN layer introduces a learnable hybrid distribution factor, allowing each computing node to adaptively mix the statistical parameters of the current batch with the global statistics. Our HBN can serve as a powerful plugin to advance federated learning performance. It reflects promising merits across a wide range of federated learning settings, especially for small batch sizes and heterogeneous data. Code is available at https://github.com/Hongyao-Chen/HybridBN.
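
The mechanism described above lends itself to a compact sketch. The following is a minimal, illustrative PyTorch module, not the authors' implementation (see the linked repository for that): the class name HybridBatchNorm2d, the scalar mixing parameter, and its sigmoid parameterisation are assumptions made for clarity.

```python
import torch
import torch.nn as nn


class HybridBatchNorm2d(nn.Module):
    """Sketch of a hybrid BN layer (illustrative names and shapes).

    Normalises each channel with a convex combination of the current
    batch statistics and global statistics, weighted by a learnable
    hybrid distribution factor, as the abstract describes.
    """

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        # Learnable affine parameters, updated by gradients as in standard BN.
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        # Learnable hybrid distribution factor; a sigmoid keeps it in (0, 1).
        # A single scalar per layer is an assumption to keep the sketch small.
        self.mix_logit = nn.Parameter(torch.zeros(1))
        # Global statistics. In federated learning these would be estimated by
        # server-side aggregation, separately from the gradient updates, rather
        # than by the usual per-client exponential running averages.
        self.register_buffer("global_mean", torch.zeros(num_features))
        self.register_buffer("global_var", torch.ones(num_features))
        self.eps = eps

    def forward(self, x):  # x: (N, C, H, W)
        if self.training:
            batch_mean = x.mean(dim=(0, 2, 3))
            batch_var = x.var(dim=(0, 2, 3), unbiased=False)
            alpha = torch.sigmoid(self.mix_logit)
            # Adaptively mix current-batch and global statistics.
            mean = alpha * batch_mean + (1.0 - alpha) * self.global_mean
            var = alpha * batch_var + (1.0 - alpha) * self.global_var
        else:
            # Evaluation relies on the global statistics alone.
            mean, var = self.global_mean, self.global_var
        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(
            var[None, :, None, None] + self.eps
        )
        return (self.weight[None, :, None, None] * x_hat
                + self.bias[None, :, None, None])
```

Under this reading, a federated round would aggregate the clients' statistics into global_mean and global_var on the server, while weight, bias, and mix_logit follow the ordinary gradient-averaging path; this is how the abstract's separation of statistical and learnable parameters would play out in practice.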

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25by,
  title     = {Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning},
  author    = {Chen, Hongyao and Xu, Tianyang and Wu, Xiaojun and Kittler, Josef},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {9500--9515},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25by/chen25by.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25by.html}
}
Endnote
%0 Conference Paper
%T Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning
%A Hongyao Chen
%A Tianyang Xu
%A Xiaojun Wu
%A Josef Kittler
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25by
%I PMLR
%P 9500--9515
%U https://proceedings.mlr.press/v267/chen25by.html
%V 267
APA
Chen, H., Xu, T., Wu, X. & Kittler, J. (2025). Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:9500-9515. Available from https://proceedings.mlr.press/v267/chen25by.html.