Provably Near-Optimal Federated Ensemble Distillation with Negligible Overhead

Won-Jun Jang, Hyeon-Seo Park, Si-Hyeon Lee
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:26896-26924, 2025.

Abstract

Federated ensemble distillation addresses client heterogeneity by generating pseudo-labels for an unlabeled server dataset based on client predictions and training the server model using the pseudo-labeled dataset. The unlabeled server dataset can either be pre-existing or generated through a data-free approach. The effectiveness of this approach critically depends on the method of assigning weights to client predictions when creating pseudo-labels, especially in highly heterogeneous settings. Inspired by theoretical results from GANs, we propose a provably near-optimal weighting method that leverages client discriminators trained with a server-distributed generator and local datasets. Our experiments on various image classification tasks demonstrate that the proposed method significantly outperforms baselines. Furthermore, we show that the additional communication cost, client-side privacy leakage, and client-side computational overhead introduced by our method are negligible, both in scenarios with and without a pre-existing server dataset.
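To make the weighting idea in the abstract concrete, the sketch below shows one plausible way per-client discriminator scores could be converted into per-sample weights over client predictions when forming pseudo-labels. This is an illustrative assumption, not the authors' exact algorithm; all names (pseudo_labels, client_probs, disc_scores) are hypothetical, and the d/(1-d) density-ratio form is borrowed from the standard GAN optimal-discriminator argument.

```python
import numpy as np

def pseudo_labels(client_probs, disc_scores, eps=1e-8):
    """Hedged sketch: combine client predictions into soft pseudo-labels.

    client_probs : (K, N, C) softmax outputs of the K client models on the
                   N unlabeled server samples.
    disc_scores  : (K, N) discriminator outputs in (0, 1), read as how likely
                   each server sample is under that client's local data
                   distribution.

    Returns an (N, C) array of soft pseudo-labels.
    """
    # Turn discriminator outputs into density-ratio-style importance weights
    # d / (1 - d); this particular form is an assumption for illustration.
    ratios = disc_scores / (1.0 - disc_scores + eps)              # (K, N)

    # Normalize across clients so the weights for each sample sum to 1.
    weights = ratios / (ratios.sum(axis=0, keepdims=True) + eps)  # (K, N)

    # Weighted average of client predictions per server sample.
    return np.einsum('kn,knc->nc', weights, client_probs)         # (N, C)


# Toy usage: 3 clients, 5 unlabeled server samples, 10 classes.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=(3, 5))                   # (3, 5, 10)
scores = rng.uniform(0.1, 0.9, size=(3, 5))                       # (3, 5)
labels = pseudo_labels(probs, scores)
print(labels.shape, labels.sum(axis=1))                           # (5, 10), each row sums to ~1
```

In this sketch a client whose discriminator judges a server sample as likely to come from its own data distribution receives proportionally more influence on that sample's pseudo-label, which matches the high-level intuition of weighting client predictions by local relevance under heterogeneity.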

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-jang25b,
  title     = {Provably Near-Optimal Federated Ensemble Distillation with Negligible Overhead},
  author    = {Jang, Won-Jun and Park, Hyeon-Seo and Lee, Si-Hyeon},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {26896--26924},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/jang25b/jang25b.pdf},
  url       = {https://proceedings.mlr.press/v267/jang25b.html},
  abstract  = {Federated ensemble distillation addresses client heterogeneity by generating pseudo-labels for an unlabeled server dataset based on client predictions and training the server model using the pseudo-labeled dataset. The unlabeled server dataset can either be pre-existing or generated through a data-free approach. The effectiveness of this approach critically depends on the method of assigning weights to client predictions when creating pseudo-labels, especially in highly heterogeneous settings. Inspired by theoretical results from GANs, we propose a provably near-optimal weighting method that leverages client discriminators trained with a server-distributed generator and local datasets. Our experiments on various image classification tasks demonstrate that the proposed method significantly outperforms baselines. Furthermore, we show that the additional communication cost, client-side privacy leakage, and client-side computational overhead introduced by our method are negligible, both in scenarios with and without a pre-existing server dataset.}
}
Endnote
%0 Conference Paper
%T Provably Near-Optimal Federated Ensemble Distillation with Negligible Overhead
%A Won-Jun Jang
%A Hyeon-Seo Park
%A Si-Hyeon Lee
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-jang25b
%I PMLR
%P 26896--26924
%U https://proceedings.mlr.press/v267/jang25b.html
%V 267
%X Federated ensemble distillation addresses client heterogeneity by generating pseudo-labels for an unlabeled server dataset based on client predictions and training the server model using the pseudo-labeled dataset. The unlabeled server dataset can either be pre-existing or generated through a data-free approach. The effectiveness of this approach critically depends on the method of assigning weights to client predictions when creating pseudo-labels, especially in highly heterogeneous settings. Inspired by theoretical results from GANs, we propose a provably near-optimal weighting method that leverages client discriminators trained with a server-distributed generator and local datasets. Our experiments on various image classification tasks demonstrate that the proposed method significantly outperforms baselines. Furthermore, we show that the additional communication cost, client-side privacy leakage, and client-side computational overhead introduced by our method are negligible, both in scenarios with and without a pre-existing server dataset.
APA
Jang, W., Park, H. & Lee, S. (2025). Provably Near-Optimal Federated Ensemble Distillation with Negligible Overhead. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:26896-26924. Available from https://proceedings.mlr.press/v267/jang25b.html.