Byzantine-Robust Federated Learning with Optimal Statistical Rates

Banghua Zhu, Lun Wang, Qi Pang, Shuai Wang, Jiantao Jiao, Dawn Song, Michael I. Jordan
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3151-3178, 2023.

Abstract

We propose Byzantine-robust federated learning protocols with nearly optimal statistical rates, based on recent progress in high-dimensional robust statistics. In contrast to prior work, our proposed protocols improve the dimension dependence and achieve a near-optimal statistical rate for strongly convex losses. We also provide a statistical lower bound for the problem. In experiments, we benchmark against competing protocols and show the empirical superiority of the proposed protocols.
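The abstract does not spell out the aggregation rule, so the following is only a minimal, hypothetical sketch of the general Byzantine-robust aggregation setting in federated learning, not the paper's protocol (which builds on high-dimensional robust mean estimation). It uses a simple coordinate-wise trimmed mean as a stand-in robust aggregator; the function name, parameters, and data are illustrative assumptions.

import numpy as np

def trimmed_mean_aggregate(client_grads: np.ndarray, trim_frac: float) -> np.ndarray:
    """Aggregate client gradients by a coordinate-wise trimmed mean.

    client_grads: array of shape (m, d), one gradient per client.
    trim_frac: fraction of values dropped at each end per coordinate,
               chosen to be at least the assumed fraction of Byzantine clients.
    """
    m, _ = client_grads.shape
    k = int(np.floor(trim_frac * m))
    # Sort each coordinate across clients, drop the k smallest and k largest
    # values, then average the remaining ones.
    sorted_grads = np.sort(client_grads, axis=0)
    kept = sorted_grads[k:m - k] if k > 0 else sorted_grads
    return kept.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, d = 50, 10                      # 50 clients, 10-dimensional model
    true_grad = np.ones(d)
    grads = true_grad + 0.1 * rng.standard_normal((m, d))
    grads[:5] = 100.0                  # 10% of clients send adversarial updates
    robust = trimmed_mean_aggregate(grads, trim_frac=0.1)
    naive = grads.mean(axis=0)
    print("naive mean error:  ", np.linalg.norm(naive - true_grad))
    print("trimmed mean error:", np.linalg.norm(robust - true_grad))

The naive average is pulled arbitrarily far by the corrupted clients, while the trimmed mean stays close to the true gradient; the paper's contribution concerns aggregators whose error scales more favorably with the dimension d than such coordinate-wise baselines.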

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-zhu23b,
  title     = {Byzantine-Robust Federated Learning with Optimal Statistical Rates},
  author    = {Zhu, Banghua and Wang, Lun and Pang, Qi and Wang, Shuai and Jiao, Jiantao and Song, Dawn and Jordan, Michael I.},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {3151--3178},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/zhu23b/zhu23b.pdf},
  url       = {https://proceedings.mlr.press/v206/zhu23b.html},
  abstract  = {We propose Byzantine-robust federated learning protocols with nearly optimal statistical rates based on recent progress in high dimensional robust statistics. In contrast to prior work, our proposed protocols improve the dimension dependence and achieve a near-optimal statistical rate for strongly convex losses. We also provide statistical lower bound for the problem. For experiments, we benchmark against competing protocols and show the empirical superiority of the proposed protocols.}
}
Endnote
%0 Conference Paper
%T Byzantine-Robust Federated Learning with Optimal Statistical Rates
%A Banghua Zhu
%A Lun Wang
%A Qi Pang
%A Shuai Wang
%A Jiantao Jiao
%A Dawn Song
%A Michael I. Jordan
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-zhu23b
%I PMLR
%P 3151--3178
%U https://proceedings.mlr.press/v206/zhu23b.html
%V 206
%X We propose Byzantine-robust federated learning protocols with nearly optimal statistical rates based on recent progress in high dimensional robust statistics. In contrast to prior work, our proposed protocols improve the dimension dependence and achieve a near-optimal statistical rate for strongly convex losses. We also provide statistical lower bound for the problem. For experiments, we benchmark against competing protocols and show the empirical superiority of the proposed protocols.
APA
Zhu, B., Wang, L., Pang, Q., Wang, S., Jiao, J., Song, D. & Jordan, M.I. (2023). Byzantine-Robust Federated Learning with Optimal Statistical Rates. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:3151-3178. Available from https://proceedings.mlr.press/v206/zhu23b.html.
