FedDGL: Federated Dynamic Graph Learning for Temporal Evolution and Data Heterogeneity

Zaipeng Xie, Li Likun, Xiangbin Chen, Hao Yu, Qian Huang
Proceedings of the 16th Asian Conference on Machine Learning, PMLR 260:463-478, 2025.

Abstract

Federated graph learning enhances federated learning by enabling privacy-preserving collaborative training on distributed graph data. While traditional methods are effective in managing data heterogeneity, they typically assume static graph structures, overlooking the dynamic nature of real-world graphs. Integrating federated graph learning with dynamic graph neural networks addresses this issue but often fails to retain previously acquired knowledge, limiting generalization for both global and personalized models. This paper proposes FedDGL, a novel framework designed to address temporal evolution and data heterogeneity in federated dynamic graph learning. Unlike conventional approaches, FedDGL captures temporal dynamics through a global knowledge distillation technique and manages client heterogeneity using a global prototype-based regularization method. The framework employs contrastive learning to generate global prototypes, enhancing feature representation while utilizing a prototype similarity-based personalized aggregation strategy for effective adaptation to local and global data distributions. Experiments on multiple benchmark datasets show that FedDGL achieves significant performance improvements over state-of-the-art methods, with up to 9.02% and 8.77% gains in local and global testing, respectively, compared to FedAvg. These results highlight FedDGL’s effectiveness in improving personalized and global model performance in dynamic, heterogeneous federated graph learning scenarios.
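To make the prototype-related ideas in the abstract concrete, below is a minimal sketch (not the authors' implementation) of two of the named components: building global prototypes from per-client class prototypes, and a prototype similarity-based personalized aggregation of client model weights. All function names, the softmax temperature, and the toy dimensions are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of prototype construction and similarity-weighted personalized
# aggregation, loosely following the components named in the FedDGL abstract.
import numpy as np

def client_prototypes(embeddings, labels, num_classes):
    """Class-wise mean embeddings computed locally on one client."""
    protos = np.zeros((num_classes, embeddings.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = embeddings[mask].mean(axis=0)
    return protos

def global_prototypes(all_protos):
    """Server-side average of client prototypes per class; in the paper these
    global prototypes regularize local training (exact loss not shown here)."""
    return np.mean(np.stack(all_protos, axis=0), axis=0)

def prototype_similarity(p_i, p_j, eps=1e-8):
    """Mean class-wise cosine similarity between two clients' prototype sets."""
    a = p_i / (np.linalg.norm(p_i, axis=1, keepdims=True) + eps)
    b = p_j / (np.linalg.norm(p_j, axis=1, keepdims=True) + eps)
    return float(np.mean(np.sum(a * b, axis=1)))

def personalized_aggregate(client_weights, client_protos, i, temperature=1.0):
    """Aggregate model weights for client i, weighting every client by the
    softmax-normalized prototype similarity to client i (assumed scheme)."""
    sims = np.array([prototype_similarity(client_protos[i], p)
                     for p in client_protos])
    alphas = np.exp(sims / temperature)
    alphas /= alphas.sum()
    return sum(a * w for a, w in zip(alphas, client_weights))

# Toy usage: 3 clients, 4 classes, 8-dim embeddings, flattened model weights.
rng = np.random.default_rng(0)
protos = [client_prototypes(rng.normal(size=(50, 8)),
                            rng.integers(0, 4, 50), num_classes=4)
          for _ in range(3)]
weights = [rng.normal(size=100) for _ in range(3)]
g_protos = global_prototypes(protos)              # would regularize local updates
personalized_w0 = personalized_aggregate(weights, protos, i=0)
```

The sketch omits the dynamic-graph encoder, the contrastive objective used to produce the prototypes, and the global knowledge distillation step; it only illustrates how prototype similarity can drive per-client aggregation weights.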

Cite this Paper


BibTeX
@InProceedings{pmlr-v260-xie25b, title = {{FedDGL}: {F}ederated Dynamic Graph Learning for Temporal Evolution and Data Heterogeneity}, author = {Xie, Zaipeng and Likun, Li and Chen, Xiangbin and Yu, Hao and Huang, Qian}, booktitle = {Proceedings of the 16th Asian Conference on Machine Learning}, pages = {463--478}, year = {2025}, editor = {Nguyen, Vu and Lin, Hsuan-Tien}, volume = {260}, series = {Proceedings of Machine Learning Research}, month = {05--08 Dec}, publisher = {PMLR}, pdf = {https://raw.githubusercontent.com/mlresearch/v260/main/assets/xie25b/xie25b.pdf}, url = {https://proceedings.mlr.press/v260/xie25b.html}, abstract = {Federated graph learning enhances federated learning by enabling privacy-preserving collaborative training on distributed graph data. While traditional methods are effective in managing data heterogeneity, they typically assume static graph structures, overlooking the dynamic nature of real-world graphs. Integrating federated graph learning with dynamic graph neural networks addresses this issue but often fails to retain previously acquired knowledge, limiting generalization for both global and personalized models. This paper proposes FedDGL, a novel framework designed to address temporal evolution and data heterogeneity in federated dynamic graph learning. Unlike conventional approaches, FedDGL captures temporal dynamics through a global knowledge distillation technique and manages client heterogeneity using a global prototype-based regularization method. The framework employs contrastive learning to generate global prototypes, enhancing feature representation while utilizing a prototype similarity-based personalized aggregation strategy for effective adaptation to local and global data distributions. Experiments on multiple benchmark datasets show that FedDGL achieves significant performance improvements over state-of-the-art methods, with up to 9.02% and 8.77% gains in local and global testing, respectively, compared to FedAvg. These results highlight FedDGL’s effectiveness in improving personalized and global model performance in dynamic, heterogeneous federated graph learning scenarios.} }
Endnote
%0 Conference Paper %T FedDGL: Federated Dynamic Graph Learning for Temporal Evolution and Data Heterogeneity %A Zaipeng Xie %A Li Likun %A Xiangbin Chen %A Hao Yu %A Qian Huang %B Proceedings of the 16th Asian Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2025 %E Vu Nguyen %E Hsuan-Tien Lin %F pmlr-v260-xie25b %I PMLR %P 463--478 %U https://proceedings.mlr.press/v260/xie25b.html %V 260 %X Federated graph learning enhances federated learning by enabling privacy-preserving collaborative training on distributed graph data. While traditional methods are effective in managing data heterogeneity, they typically assume static graph structures, overlooking the dynamic nature of real-world graphs. Integrating federated graph learning with dynamic graph neural networks addresses this issue but often fails to retain previously acquired knowledge, limiting generalization for both global and personalized models. This paper proposes FedDGL, a novel framework designed to address temporal evolution and data heterogeneity in federated dynamic graph learning. Unlike conventional approaches, FedDGL captures temporal dynamics through a global knowledge distillation technique and manages client heterogeneity using a global prototype-based regularization method. The framework employs contrastive learning to generate global prototypes, enhancing feature representation while utilizing a prototype similarity-based personalized aggregation strategy for effective adaptation to local and global data distributions. Experiments on multiple benchmark datasets show that FedDGL achieves significant performance improvements over state-of-the-art methods, with up to 9.02% and 8.77% gains in local and global testing, respectively, compared to FedAvg. These results highlight FedDGL’s effectiveness in improving personalized and global model performance in dynamic, heterogeneous federated graph learning scenarios.
APA
Xie, Z., Likun, L., Chen, X., Yu, H. & Huang, Q. (2025). FedDGL: Federated Dynamic Graph Learning for Temporal Evolution and Data Heterogeneity. Proceedings of the 16th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 260:463-478. Available from https://proceedings.mlr.press/v260/xie25b.html.