Byzantine-Resilient Federated Alternating Gradient Descent and Minimization for Partly-Decoupled Low Rank Matrix Learning
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:55684-55701, 2025.
Abstract
This work has two contributions. First, we introduce provably secure (Byzantine-resilient), sample- and communication-efficient alternating gradient descent (GD) and minimization based algorithms for solving the federated low rank matrix completion (LRMC) problem: learning a low rank (LR) matrix from a small subset of its entries. Second, we extend our ideas to show how a special case of our algorithms also solves another partly-decoupled vertically federated LR matrix learning problem, namely LR column-wise sensing (LRCS), also referred to in the literature as multi-task linear representation learning. Finally, we show how our results extend to the LR phase retrieval problem. In all three problems, we consider column-wise or vertical federation, i.e., each node observes a small subset of the entries of a disjoint column sub-matrix of the entire LR matrix. For LRMC, horizontal federation is equivalent since the problem is symmetric across rows and columns; for the other two problems it is not. In all three problems, the data at different nodes is heterogeneous (not identically distributed), which makes obtaining provable guarantees harder.
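To make the LRMC setting concrete, the following is a minimal centralized sketch of alternating minimization for matrix completion: recovering a rank-r matrix X* = U*V*^T from a random subset of its entries by alternately solving least-squares problems for the factors. This is only an illustration of the basic (non-federated, non-Byzantine-resilient) building block, with invented dimensions and a plain random initialization; the paper's actual algorithms, federation scheme, and resilience mechanisms are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-r matrix Xstar = Ustar @ Vstar.T (n x q), with entries
# observed on a random mask (illustrative sizes, not from the paper).
n, q, r = 30, 40, 2
Ustar = rng.standard_normal((n, r))
Vstar = rng.standard_normal((q, r))
Xstar = Ustar @ Vstar.T
mask = rng.random((n, q)) < 0.5   # ~50% of entries observed
Y = np.where(mask, Xstar, 0.0)

# Alternating minimization: with U fixed, each column of V solves a small
# least-squares problem over that column's observed entries (this is the
# column-wise decoupling that vertical federation exploits); symmetrically
# for the rows of U.
U = np.linalg.qr(rng.standard_normal((n, r)))[0]  # random orthonormal init
V = np.zeros((q, r))
for _ in range(30):
    for k in range(q):            # V-update: columns decouple
        rows = mask[:, k]
        V[k], *_ = np.linalg.lstsq(U[rows], Y[rows, k], rcond=None)
    for i in range(n):            # U-update: rows decouple
        cols = mask[i]
        U[i], *_ = np.linalg.lstsq(V[cols], Y[i, cols], rcond=None)

err = np.linalg.norm(U @ V.T - Xstar) / np.linalg.norm(Xstar)
```

In the vertically federated version described in the abstract, each node would hold the observed entries of its own column sub-matrix, so the V-updates stay local while only information about the shared factor U needs to be communicated and defended against Byzantine nodes.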