Efficient Contrastive Learning for Fast and Accurate Inference on Graphs

Teng Xiao, Huaisheng Zhu, Zhiwei Zhang, Zhimeng Guo, Charu C. Aggarwal, Suhang Wang, Vasant G Honavar
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:54363-54381, 2024.

Abstract

Graph contrastive learning has made remarkable advances in settings where there is a scarcity of task-specific labels. Despite these advances, the significant computational overhead of representation inference incurred by existing methods that rely on intensive message passing makes them unsuitable for latency-constrained applications. In this paper, we present GraphECL, a simple and efficient contrastive learning method for fast inference on graphs. GraphECL does away with the need for expensive message passing during inference. Specifically, it introduces a novel coupling of MLP and GNN models, where the former learns to efficiently mimic the computations performed by the latter. We provide a theoretical analysis showing why the MLP can capture essential structural information in neighbors well enough to match the performance of the GNN on downstream tasks. Extensive experiments on widely used real-world benchmarks show that GraphECL achieves superior performance and inference efficiency compared to state-of-the-art graph contrastive learning (GCL) methods on homophilous and heterophilous graphs.
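The coupling described above can be illustrated with a minimal NumPy sketch. This is not the authors' exact objective; it assumes a simple InfoNCE-style loss that pulls the MLP embedding of a node toward the GNN embedding of the same node (one round of mean aggregation) and pushes it away from GNN embeddings of other nodes. The graph, weights, and temperature are hypothetical toy values. The point is the asymmetry: the GNN path touches neighbors and is used only during training, while inference needs just the per-node MLP forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes with 8-dim features (hypothetical example data).
features = rng.normal(size=(4, 8))
neighbors = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}

W_mlp = rng.normal(size=(8, 4))  # structure-free MLP encoder
W_gnn = rng.normal(size=(8, 4))  # message-passing GNN encoder

def mlp_embed(x):
    # Inference-time encoder: one affine map + nonlinearity per node,
    # with no neighbor access required.
    return np.tanh(x @ W_mlp)

def gnn_embed(v):
    # One round of mean aggregation over v's neighbors, then a linear
    # map: the expensive message-passing path, used only in training.
    agg = features[neighbors[v]].mean(axis=0)
    return np.tanh(agg @ W_gnn)

def contrastive_loss(v, temperature=0.5):
    # InfoNCE-style coupling: the MLP view of node v is scored against
    # GNN views of all nodes; the GNN view of v itself is the positive.
    anchor = mlp_embed(features[v])
    views = np.stack([gnn_embed(u) for u in range(len(features))])
    logits = views @ anchor / temperature
    logits -= logits.max()  # numerical stability
    return -np.log(np.exp(logits[v]) / np.exp(logits).sum())

loss = float(np.mean([contrastive_loss(v) for v in neighbors]))
print(f"toy training loss: {loss:.3f}")
```

In a full training loop the weights would be optimized by gradient descent on this loss; at deployment only `mlp_embed` is evaluated, so inference cost is independent of node degree.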

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-xiao24g,
  title     = {Efficient Contrastive Learning for Fast and Accurate Inference on Graphs},
  author    = {Xiao, Teng and Zhu, Huaisheng and Zhang, Zhiwei and Guo, Zhimeng and Aggarwal, Charu C. and Wang, Suhang and Honavar, Vasant G},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {54363--54381},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/xiao24g/xiao24g.pdf},
  url       = {https://proceedings.mlr.press/v235/xiao24g.html},
  abstract  = {Graph contrastive learning has made remarkable advances in settings where there is a scarcity of task-specific labels. Despite these advances, the significant computational overhead of representation inference incurred by existing methods that rely on intensive message passing makes them unsuitable for latency-constrained applications. In this paper, we present GraphECL, a simple and efficient contrastive learning method for fast inference on graphs. GraphECL does away with the need for expensive message passing during inference. Specifically, it introduces a novel coupling of MLP and GNN models, where the former learns to efficiently mimic the computations performed by the latter. We provide a theoretical analysis showing why the MLP can capture essential structural information in neighbors well enough to match the performance of the GNN on downstream tasks. Extensive experiments on widely used real-world benchmarks show that GraphECL achieves superior performance and inference efficiency compared to state-of-the-art graph contrastive learning (GCL) methods on homophilous and heterophilous graphs.}
}
Endnote
%0 Conference Paper
%T Efficient Contrastive Learning for Fast and Accurate Inference on Graphs
%A Teng Xiao
%A Huaisheng Zhu
%A Zhiwei Zhang
%A Zhimeng Guo
%A Charu C. Aggarwal
%A Suhang Wang
%A Vasant G Honavar
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-xiao24g
%I PMLR
%P 54363--54381
%U https://proceedings.mlr.press/v235/xiao24g.html
%V 235
%X Graph contrastive learning has made remarkable advances in settings where there is a scarcity of task-specific labels. Despite these advances, the significant computational overhead of representation inference incurred by existing methods that rely on intensive message passing makes them unsuitable for latency-constrained applications. In this paper, we present GraphECL, a simple and efficient contrastive learning method for fast inference on graphs. GraphECL does away with the need for expensive message passing during inference. Specifically, it introduces a novel coupling of MLP and GNN models, where the former learns to efficiently mimic the computations performed by the latter. We provide a theoretical analysis showing why the MLP can capture essential structural information in neighbors well enough to match the performance of the GNN on downstream tasks. Extensive experiments on widely used real-world benchmarks show that GraphECL achieves superior performance and inference efficiency compared to state-of-the-art graph contrastive learning (GCL) methods on homophilous and heterophilous graphs.
APA
Xiao, T., Zhu, H., Zhang, Z., Guo, Z., Aggarwal, C.C., Wang, S. & Honavar, V.G. (2024). Efficient Contrastive Learning for Fast and Accurate Inference on Graphs. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:54363-54381. Available from https://proceedings.mlr.press/v235/xiao24g.html.