EAGLES: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification

Zitong Shi, Guancheng Wan, Wenke Huang, Guibin Zhang, He Li, Carl Yang, Mang Ye
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:55046-55064, 2025.

Abstract

Federated Graph Learning (FGL) has gained significant attention as a privacy-preserving approach to collaborative learning, but its computational demands grow substantially as datasets scale and Graph Neural Network (GNN) layers deepen. To address these challenges, we propose $\textbf{EAGLES}$, a unified sparsification framework. EAGLES applies client-consensus parameter sparsification to generate multiple unbiased subnetworks at varying sparsity levels, reducing the need for iterative adjustments and mitigating performance degradation. In the graph structure domain, we introduce a dual-expert approach: a $\textit{graph sparsification expert}$ performs multi-criteria node-level sparsification, and a $\textit{graph synergy expert}$ integrates contextual node information to produce optimal sparse subgraphs. Furthermore, the framework introduces a novel distance metric that leverages node contextual information to measure structural similarity among clients, fostering effective knowledge sharing. We also introduce the $\textbf{Harmony Sparsification Principle}$, under which EAGLES balances model performance with lightweight graph and model structures. Extensive experiments across various datasets demonstrate its superiority: for example, EAGLES reduces training FLOPs by 82% and communication costs by 80% on the ogbn-proteins dataset while maintaining competitive performance.
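To make the "subnetworks at varying sparsity levels" idea concrete, below is a minimal sketch using plain global magnitude masking, not the paper's client-consensus procedure; the function name `sparsify_at_levels`, the choice of PyTorch, and the default sparsity levels are all illustrative assumptions.

```python
import torch

def sparsify_at_levels(params, sparsity_levels=(0.5, 0.8, 0.9)):
    """Return one magnitude-masked copy of `params` per target sparsity.

    `params`: dict of name -> tensor (e.g. dict(model.named_parameters())).
    For each level s, the fraction s of weights with the smallest absolute
    value, measured globally across all tensors, is zeroed out.
    """
    magnitudes = torch.cat([p.detach().abs().flatten() for p in params.values()])
    subnetworks = []
    for s in sparsity_levels:
        k = max(1, int(s * magnitudes.numel()))      # rank of the cut-off weight
        threshold = magnitudes.kthvalue(k).values    # global magnitude threshold
        subnetworks.append({
            name: p.detach() * (p.detach().abs() >= threshold).to(p.dtype)
            for name, p in params.items()
        })
    return subnetworks

# Example: three sparse subnetworks of a toy model at 50%/80%/90% sparsity.
model = torch.nn.Linear(16, 4)
nets = sparsify_at_levels(dict(model.named_parameters()))
```

The sketch only covers the masking itself; in a federated setting, deciding which sparsity level each client trains and communicates is the role of the client-consensus step described in the paper.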

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-shi25j,
  title     = {{EAGLES}: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification},
  author    = {Shi, Zitong and Wan, Guancheng and Huang, Wenke and Zhang, Guibin and Li, He and Yang, Carl and Ye, Mang},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {55046--55064},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/shi25j/shi25j.pdf},
  url       = {https://proceedings.mlr.press/v267/shi25j.html}
}
Endnote
%0 Conference Paper
%T EAGLES: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification
%A Zitong Shi
%A Guancheng Wan
%A Wenke Huang
%A Guibin Zhang
%A He Li
%A Carl Yang
%A Mang Ye
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-shi25j
%I PMLR
%P 55046--55064
%U https://proceedings.mlr.press/v267/shi25j.html
%V 267
APA
Shi, Z., Wan, G., Huang, W., Zhang, G., Li, H., Yang, C. & Ye, M. (2025). EAGLES: Towards Effective, Efficient, and Economical Federated Graph Learning via Unified Sparsification. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:55046-55064. Available from https://proceedings.mlr.press/v267/shi25j.html.
