LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation

Rui Xue, Haoyu Han, Mohamadali Torkamani, Jian Pei, Xiaorui Liu
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:38926-38937, 2023.

Abstract

Recent works have demonstrated the benefits of capturing long-distance dependency in graphs by deeper graph neural networks (GNNs). But deeper GNNs suffer from the long-lasting scalability challenge due to the neighborhood explosion problem in large-scale graphs. In this work, we propose to capture long-distance dependency in graphs by shallower models instead of deeper models, which leads to a much more efficient model, LazyGNN, for graph representation learning. Moreover, we demonstrate that LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further accelerations through the development of mini-batch LazyGNN. Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks. The implementation of LazyGNN is available at https://github.com/RXPHD/Lazy_GNN.
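The abstract's central idea, capturing long-range dependency with a shallow model rather than a deep one, can be illustrated with a toy sketch. The snippet below is not the paper's algorithm; it is a minimal, hedged illustration of "lazy" propagation in the loose sense of warm-starting each epoch's shallow feature propagation from embeddings cached in the previous epoch, so long-range information accumulates across epochs instead of requiring many layers. All names (`LazyPropagation`, `normalize_adj`, `alpha`, `hops`) are ours for illustration; see the linked repository for the actual implementation.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize an adjacency matrix with self-loops:
    D^{-1/2} (A + I) D^{-1/2}, as in standard GNN propagation."""
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

class LazyPropagation:
    """Toy sketch (not the paper's algorithm): a shallow propagation
    warm-started from the previous epoch's cached embeddings, so
    information spreads further than `hops` as training proceeds."""

    def __init__(self, A_hat, alpha=0.5, hops=2):
        self.A_hat = A_hat    # normalized adjacency
        self.alpha = alpha    # weight on the stale cached embeddings
        self.hops = hops      # shallow: only a few hops per epoch
        self.cache = None     # embeddings left over from the last epoch

    def forward(self, X):
        # Lazily reuse the previous epoch's result instead of
        # restarting propagation from the raw features each time.
        if self.cache is None:
            H = X
        else:
            H = self.alpha * self.cache + (1 - self.alpha) * X
        for _ in range(self.hops):
            H = self.A_hat @ H
        self.cache = H        # stash for the next epoch
        return H
```

On a 6-node path graph with a one-hot feature on node 0 and `hops=2`, a single call reaches only nodes within 2 hops, but a second call, warm-started from the cache, carries signal out to node 4: the shallow model effectively sees 4 hops after two epochs.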

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-xue23c,
  title     = {{L}azy{GNN}: Large-Scale Graph Neural Networks via Lazy Propagation},
  author    = {Xue, Rui and Han, Haoyu and Torkamani, Mohamadali and Pei, Jian and Liu, Xiaorui},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {38926--38937},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/xue23c/xue23c.pdf},
  url       = {https://proceedings.mlr.press/v202/xue23c.html},
  abstract  = {Recent works have demonstrated the benefits of capturing long-distance dependency in graphs by deeper graph neural networks (GNNs). But deeper GNNs suffer from the long-lasting scalability challenge due to the neighborhood explosion problem in large-scale graphs. In this work, we propose to capture long-distance dependency in graphs by shallower models instead of deeper models, which leads to a much more efficient model, LazyGNN, for graph representation learning. Moreover, we demonstrate that LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further accelerations through the development of mini-batch LazyGNN. Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks. The implementation of LazyGNN is available at https://github.com/RXPHD/Lazy_GNN.}
}
Endnote
%0 Conference Paper
%T LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation
%A Rui Xue
%A Haoyu Han
%A Mohamadali Torkamani
%A Jian Pei
%A Xiaorui Liu
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-xue23c
%I PMLR
%P 38926--38937
%U https://proceedings.mlr.press/v202/xue23c.html
%V 202
%X Recent works have demonstrated the benefits of capturing long-distance dependency in graphs by deeper graph neural networks (GNNs). But deeper GNNs suffer from the long-lasting scalability challenge due to the neighborhood explosion problem in large-scale graphs. In this work, we propose to capture long-distance dependency in graphs by shallower models instead of deeper models, which leads to a much more efficient model, LazyGNN, for graph representation learning. Moreover, we demonstrate that LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further accelerations through the development of mini-batch LazyGNN. Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks. The implementation of LazyGNN is available at https://github.com/RXPHD/Lazy_GNN.
APA
Xue, R., Han, H., Torkamani, M., Pei, J. & Liu, X. (2023). LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:38926-38937. Available from https://proceedings.mlr.press/v202/xue23c.html.