Improving Graph Contrastive Learning with Community Structure

Xiang Chen, Kun Yue, Liang Duan, Lixing Yu
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:568-585, 2025.

Abstract

Graph contrastive learning (GCL) has demonstrated remarkable success in training graph neural networks (GNNs) by distinguishing positive and negative node pairs without human labeling. However, existing GCL methods often suffer from two limitations: the repetitive message-passing mechanism in GNNs and the quadratic computational complexity of exhaustive node pair sampling in the loss function. To address these issues, we propose an efficient and effective GCL framework that leverages community structure rather than relying on intricate node-to-node adjacency information. Inspired by the concept of sparse low-rank approximation of graph diffusion matrices, our model delivers node messages to the corresponding communities instead of individual neighbors. By exploiting community structures, our method significantly improves GCL efficiency by reducing the number of node pairs needed for contrastive loss calculation. Furthermore, we theoretically prove that our model effectively captures essential structure information for downstream tasks. Extensive experiments conducted on real-world datasets illustrate that our method not only achieves state-of-the-art performance but also substantially reduces time and memory consumption compared with other GCL methods. Our code is available at https://github.com/chenx-hi/IGCL-CS.
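The efficiency argument in the abstract can be illustrated with a small sketch. This is not the authors' implementation (see their repository for that); it is a hedged toy example, with illustrative names (`community_centroids`, `node_community_infonce`), showing how contrasting each node against K community centroids needs only N·K similarity computations instead of the N² needed by exhaustive node-pair sampling.

```python
# Hedged sketch, NOT the paper's code: node-to-community contrastive
# scoring. With N nodes and K communities, the similarity table below
# has N*K entries rather than the N^2 entries of all node pairs.
import numpy as np

def community_centroids(z, comm):
    """Mean-pool node embeddings z (N x d) into K community centroids."""
    k = comm.max() + 1
    c = np.zeros((k, z.shape[1]))
    np.add.at(c, comm, z)                       # sum embeddings per community
    counts = np.bincount(comm, minlength=k)
    return c / counts[:, None]

def node_community_infonce(z, comm, tau=0.5):
    """InfoNCE-style loss over node-to-community similarities.
    Positive pair: (node, its own community); negatives: the other
    communities. Only N*K similarities are computed."""
    c = community_centroids(z, comm)
    zn = z / np.linalg.norm(z, axis=1, keepdims=True)
    cn = c / np.linalg.norm(c, axis=1, keepdims=True)
    sim = np.exp(zn @ cn.T / tau)               # N x K similarity table
    pos = sim[np.arange(len(z)), comm]          # similarity to own community
    return float(-np.log(pos / sim.sum(axis=1)).mean())

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))                     # 8 nodes, 4-dim embeddings
comm = np.array([0, 0, 1, 1, 1, 2, 2, 2])       # 3 communities
loss = node_community_infonce(z, comm)
```

Under this toy setup the pair count drops from 8×8 = 64 node-node comparisons to 8×3 = 24 node-community comparisons; the actual loss, positive-pair definition, and community detection used by the paper may differ.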

Cite this Paper


BibTeX
@InProceedings{pmlr-v286-chen25a,
  title = {Improving Graph Contrastive Learning with Community Structure},
  author = {Chen, Xiang and Yue, Kun and Duan, Liang and Yu, Lixing},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages = {568--585},
  year = {2025},
  editor = {Chiappa, Silvia and Magliacane, Sara},
  volume = {286},
  series = {Proceedings of Machine Learning Research},
  month = {21--25 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/chen25a/chen25a.pdf},
  url = {https://proceedings.mlr.press/v286/chen25a.html},
  abstract = {Graph contrastive learning (GCL) has demonstrated remarkable success in training graph neural networks (GNNs) by distinguishing positive and negative node pairs without human labeling. However, existing GCL methods often suffer from two limitations: the repetitive message-passing mechanism in GNNs and the quadratic computational complexity of exhaustive node pair sampling in the loss function. To address these issues, we propose an efficient and effective GCL framework that leverages community structure rather than relying on intricate node-to-node adjacency information. Inspired by the concept of sparse low-rank approximation of graph diffusion matrices, our model delivers node messages to the corresponding communities instead of individual neighbors. By exploiting community structures, our method significantly improves GCL efficiency by reducing the number of node pairs needed for contrastive loss calculation. Furthermore, we theoretically prove that our model effectively captures essential structure information for downstream tasks. Extensive experiments conducted on real-world datasets illustrate that our method not only achieves state-of-the-art performance but also substantially reduces time and memory consumption compared with other GCL methods. Our code is available at https://github.com/chenx-hi/IGCL-CS.}
}
Endnote
%0 Conference Paper
%T Improving Graph Contrastive Learning with Community Structure
%A Xiang Chen
%A Kun Yue
%A Liang Duan
%A Lixing Yu
%B Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2025
%E Silvia Chiappa
%E Sara Magliacane
%F pmlr-v286-chen25a
%I PMLR
%P 568--585
%U https://proceedings.mlr.press/v286/chen25a.html
%V 286
%X Graph contrastive learning (GCL) has demonstrated remarkable success in training graph neural networks (GNNs) by distinguishing positive and negative node pairs without human labeling. However, existing GCL methods often suffer from two limitations: the repetitive message-passing mechanism in GNNs and the quadratic computational complexity of exhaustive node pair sampling in the loss function. To address these issues, we propose an efficient and effective GCL framework that leverages community structure rather than relying on intricate node-to-node adjacency information. Inspired by the concept of sparse low-rank approximation of graph diffusion matrices, our model delivers node messages to the corresponding communities instead of individual neighbors. By exploiting community structures, our method significantly improves GCL efficiency by reducing the number of node pairs needed for contrastive loss calculation. Furthermore, we theoretically prove that our model effectively captures essential structure information for downstream tasks. Extensive experiments conducted on real-world datasets illustrate that our method not only achieves state-of-the-art performance but also substantially reduces time and memory consumption compared with other GCL methods. Our code is available at https://github.com/chenx-hi/IGCL-CS.
APA
Chen, X., Yue, K., Duan, L. & Yu, L. (2025). Improving Graph Contrastive Learning with Community Structure. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:568-585. Available from https://proceedings.mlr.press/v286/chen25a.html.
