FedDGL: Federated Dynamic Graph Learning for Temporal Evolution and Data Heterogeneity
Proceedings of the 16th Asian Conference on Machine Learning, PMLR 260:463-478, 2025.
Abstract
Federated graph learning enhances federated learning by enabling privacy-preserving collaborative training on distributed graph data. While traditional methods are effective in managing data heterogeneity, they typically assume static graph structures, overlooking the dynamic nature of real-world graphs. Integrating federated graph learning with dynamic graph neural networks addresses this issue but often fails to retain previously acquired knowledge, limiting generalization for both global and personalized models. This paper proposes FedDGL, a novel framework designed to address temporal evolution and data heterogeneity in federated dynamic graph learning. Unlike conventional approaches, FedDGL captures temporal dynamics through a global knowledge distillation technique and manages client heterogeneity using a global prototype-based regularization method. The framework employs contrastive learning to generate global prototypes, enhancing feature representation, and uses a prototype similarity-based personalized aggregation strategy to adapt effectively to both local and global data distributions. Experiments on multiple benchmark datasets show that FedDGL achieves significant performance improvements over state-of-the-art methods, with gains of up to 9.02% in local testing and 8.77% in global testing compared to FedAvg. These results highlight FedDGL's effectiveness in improving both personalized and global model performance in dynamic, heterogeneous federated graph learning scenarios.
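To make the aggregation idea concrete, the following is a minimal sketch of prototype similarity-based personalized aggregation, one of the mechanisms the abstract names. All function and variable names here are hypothetical, and the similarity measure (cosine) and softmax weighting are assumptions; the paper's exact aggregation rule may differ.

```python
import numpy as np

def prototype_similarity_aggregate(client_params, client_protos, target_proto):
    """Hypothetical sketch: build a personalized model for one client by
    averaging all clients' parameter vectors, weighted by the cosine
    similarity between each client's prototype and the target client's
    prototype (softmax-normalized)."""
    sims = np.array([
        np.dot(p, target_proto)
        / (np.linalg.norm(p) * np.linalg.norm(target_proto) + 1e-8)
        for p in client_protos
    ])
    # Softmax turns similarities into aggregation weights that sum to 1.
    weights = np.exp(sims) / np.sum(np.exp(sims))
    aggregated = sum(w * params for w, params in zip(weights, client_params))
    return aggregated, weights

# Toy example: three clients with 4-dim parameter vectors and 2-dim prototypes.
params = [np.ones(4), 2 * np.ones(4), 3 * np.ones(4)]
protos = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
agg, w = prototype_similarity_aggregate(params, protos, np.array([1.0, 0.0]))
```

Clients whose prototypes are closer to the target client's prototype receive larger weights, so the personalized model leans toward clients with similar data distributions.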