PFedAtt: Attention-based Personalized Federated Learning on Heterogeneous Clients
Proceedings of The 13th Asian Conference on Machine Learning, PMLR 157:1253-1268, 2021.
Abstract
In federated learning, heterogeneity among the clients' local datasets leads to large variations in the number of local updates each client performs in a communication round. Naively aggregating such local models into a single global model limits the capacity of the system: one global model cannot deliver good performance on every client's task. This paper provides a general framework for analyzing the convergence of personalized federated learning algorithms. It subsumes previously proposed methods and gives a principled understanding of their computational guarantees. Using insights from this analysis, we propose PFedAtt, a personalized federated learning method that incorporates attention-based grouping to facilitate collaboration among similar clients. Theoretically, we provide a convergence guarantee for the algorithm, and empirical experiments corroborate the competitive performance of PFedAtt on heterogeneous clients.
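To make the idea of attention-based personalized aggregation concrete, the listing below is a minimal sketch of one plausible scheme: each client's personalized model is a softmax-weighted combination of all clients' local models, with weights derived from parameter similarity. It is not the PFedAtt algorithm itself; the function names, similarity measure, and temperature parameter are illustrative assumptions.

    # Sketch of attention-weighted personalized aggregation (illustrative only,
    # not the authors' PFedAtt algorithm).
    import numpy as np

    def attention_weights(query: np.ndarray, keys: np.ndarray, temperature: float = 1.0) -> np.ndarray:
        """Softmax over negative squared distances between one client's
        flattened parameters (query) and all clients' parameters (keys)."""
        scores = -np.sum((keys - query) ** 2, axis=1) / temperature
        scores -= scores.max()            # numerical stability
        weights = np.exp(scores)
        return weights / weights.sum()

    def personalized_aggregate(client_params: np.ndarray) -> np.ndarray:
        """For each client, aggregate all local models with its own attention weights."""
        personalized = np.empty_like(client_params)
        for i, query in enumerate(client_params):
            w = attention_weights(query, client_params)
            personalized[i] = w @ client_params    # weighted sum of flattened models
        return personalized

    # Toy usage: 5 clients, each holding a 10-dimensional flattened model.
    rng = np.random.default_rng(0)
    local_models = rng.normal(size=(5, 10))
    print(personalized_aggregate(local_models).shape)   # (5, 10)

Under this kind of scheme, clients with similar local models receive larger weights in each other's aggregates, which is one simple way to realize the "grouping of similar clients" described in the abstract.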