Asynchronous Personalized Federated Learning with Irregular Clients
Proceedings of The 14th Asian Conference on Machine
Learning, PMLR 189:706-721, 2023.
Abstract
To provide intelligent and personalized models for clients, personalized federated learning (PFL) enables learning from data, identifying patterns, and making automated decisions in a privacy-preserving manner. PFL trains models independently on multiple clients and aggregates them in synchronous steps. However, the assumptions made by existing works are unrealistic given the heterogeneity of clients: the volume and distribution of the collected data change over the course of training, and clients also differ in their available system configurations, leading to vast heterogeneity across the system. To address these challenges, we present an asynchronous method (AsyPFL), in which clients learn personalized models with respect to their local data by making the most informative parameters less volatile, while the central server aggregates model parameters asynchronously. In addition, we reformulate PFL by unifying the synchronous and asynchronous updating schemes through a single asynchrony-related parameter. Theoretically, we show that AsyPFL's convergence rate is state-of-the-art and provide guarantees for choosing key hyperparameters optimally. Backed by these guarantees, we validate AsyPFL on different tasks under non-IID and staleness settings. The results indicate that, given a large proportion of irregular clients, AsyPFL outperforms vanilla PFL algorithms in both non-IID and IID cases.
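For intuition, below is a minimal sketch of what staleness-weighted asynchronous aggregation of this general kind can look like. The abstract does not spell out AsyPFL's actual update rule, so everything here is an illustrative assumption: the class name AsyncServer, the decay function staleness_weight, and the base coefficient alpha0 are hypothetical, and the square-root decay is one common choice, not the paper's.

    import numpy as np

    def staleness_weight(alpha0, tau):
        """Decay the mixing coefficient with staleness tau.
        The decay form is an assumption; the paper's asynchrony-related
        parameter is not specified in the abstract."""
        return alpha0 / (1.0 + tau) ** 0.5

    class AsyncServer:
        """Minimal asynchronous aggregation loop (illustrative only)."""

        def __init__(self, init_params, alpha0=0.6):
            self.params = init_params   # global model parameters
            self.version = 0            # global update counter
            self.alpha0 = alpha0        # base mixing coefficient

        def dispatch(self):
            """Hand a copy of the current model and its version to a client."""
            return self.params.copy(), self.version

        def receive(self, client_params, client_version):
            """Fold one client's update into the global model as soon as it
            arrives; staleness = global updates since that client's dispatch."""
            tau = self.version - client_version
            alpha = staleness_weight(self.alpha0, tau)
            # With tau == 0 for every client this reduces to a synchronous
            # mixing step, so a single parameter spans both regimes, which
            # mirrors the unification the abstract describes.
            self.params = (1 - alpha) * self.params + alpha * client_params
            self.version += 1

For example, server = AsyncServer(np.zeros(10)) followed by receive calls as client updates trickle in applies each update immediately with a weight that shrinks as the contributing model grows staler, rather than waiting for a full synchronous round.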