Understanding the Statistical Accuracy-Communication Trade-off in Personalized Federated Learning with Minimax Guarantees

Xin Yu, Zelin He, Ying Sun, Lingzhou Xue, Runze Li
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:73009-73050, 2025.

Abstract

Personalized federated learning (PFL) offers a flexible framework for aggregating information across distributed clients with heterogeneous data. This work considers a personalized federated learning setting that simultaneously learns global and local models. While purely local training has no communication cost, collaborative learning among the clients can leverage shared knowledge to improve statistical accuracy, presenting an accuracy-communication trade-off in personalized federated learning. However, the theoretical analysis of how personalization quantitatively influences sample and algorithmic efficiency, and of their inherent trade-off, is largely unexplored. This paper makes a contribution towards filling this gap by providing a quantitative characterization of the effect of the personalization degree on the trade-off. The results further offer theoretical insights for choosing the personalization degree. As a side contribution, we establish the minimax optimality in terms of statistical accuracy for a widely studied PFL formulation. The theoretical result is validated on both synthetic and real-world datasets, and its generalizability is verified in a non-convex setting.

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-yu25h,
  title     = {Understanding the Statistical Accuracy-Communication Trade-off in Personalized Federated Learning with Minimax Guarantees},
  author    = {Yu, Xin and He, Zelin and Sun, Ying and Xue, Lingzhou and Li, Runze},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {73009--73050},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/yu25h/yu25h.pdf},
  url       = {https://proceedings.mlr.press/v267/yu25h.html},
  abstract  = {Personalized federated learning (PFL) offers a flexible framework for aggregating information across distributed clients with heterogeneous data. This work considers a personalized federated learning setting that simultaneously learns global and local models. While purely local training has no communication cost, collaborative learning among the clients can leverage shared knowledge to improve statistical accuracy, presenting an accuracy-communication trade-off in personalized federated learning. However, the theoretical analysis of how personalization quantitatively influences sample and algorithmic efficiency and their inherent trade-off is largely unexplored. This paper makes a contribution towards filling this gap, by providing a quantitative characterization of the personalization degree on the tradeoff. The results further offer theoretical insights for choosing the personalization degree. As a side contribution, we establish the minimax optimality in terms of statistical accuracy for a widely studied PFL formulation. The theoretical result is validated on both synthetic and real-world datasets and its generalizability is verified in a non-convex setting.}
}
Endnote
%0 Conference Paper
%T Understanding the Statistical Accuracy-Communication Trade-off in Personalized Federated Learning with Minimax Guarantees
%A Xin Yu
%A Zelin He
%A Ying Sun
%A Lingzhou Xue
%A Runze Li
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-yu25h
%I PMLR
%P 73009--73050
%U https://proceedings.mlr.press/v267/yu25h.html
%V 267
%X Personalized federated learning (PFL) offers a flexible framework for aggregating information across distributed clients with heterogeneous data. This work considers a personalized federated learning setting that simultaneously learns global and local models. While purely local training has no communication cost, collaborative learning among the clients can leverage shared knowledge to improve statistical accuracy, presenting an accuracy-communication trade-off in personalized federated learning. However, the theoretical analysis of how personalization quantitatively influences sample and algorithmic efficiency and their inherent trade-off is largely unexplored. This paper makes a contribution towards filling this gap, by providing a quantitative characterization of the personalization degree on the tradeoff. The results further offer theoretical insights for choosing the personalization degree. As a side contribution, we establish the minimax optimality in terms of statistical accuracy for a widely studied PFL formulation. The theoretical result is validated on both synthetic and real-world datasets and its generalizability is verified in a non-convex setting.
APA
Yu, X., He, Z., Sun, Y., Xue, L., & Li, R. (2025). Understanding the Statistical Accuracy-Communication Trade-off in Personalized Federated Learning with Minimax Guarantees. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:73009-73050. Available from https://proceedings.mlr.press/v267/yu25h.html.