Simple and Deep Graph Convolutional Networks

Ming Chen, Zhewei Wei, Zengfeng Huang, Bolin Ding, Yaliang Li
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1725-1735, 2020.

Abstract

Graph convolutional networks (GCNs) are a powerful deep learning approach for graph-structured data. Recently, GCNs and subsequent variants have shown superior performance in various application areas on real-world datasets. Despite their success, most current GCN models are shallow, due to the \emph{over-smoothing} problem. In this paper, we study the problem of designing and analyzing deep graph convolutional networks. We propose GCNII, an extension of the vanilla GCN model with two simple yet effective techniques: \emph{initial residual} and \emph{identity mapping}. We provide theoretical and empirical evidence that the two techniques effectively relieve the problem of over-smoothing. Our experiments show that the deep GCNII model outperforms state-of-the-art methods on various semi- and full-supervised tasks.
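To make the two techniques concrete, below is a minimal NumPy sketch of one GCNII propagation step. It follows the layer rule described in the paper, H^{(l+1)} = σ(((1-α)P̂H^{(l)} + αH^{(0)})((1-β_l)I + β_l W^{(l)})), where P̂ is the symmetrically normalized adjacency with self-loops; the function name `gcnii_layer` and variable names are illustrative, not from the authors' released code.

```python
import numpy as np

def gcnii_layer(H, H0, P_hat, W, alpha, beta):
    """One GCNII propagation step (sketch).

    H:     current representation, shape (n, d)
    H0:    initial representation H^{(0)}, shape (n, d)
    P_hat: normalized adjacency with self-loops, D^{-1/2}(A+I)D^{-1/2}, shape (n, n)
    W:     layer weight matrix, shape (d, d)
    alpha: initial-residual strength
    beta:  identity-mapping strength at this layer (decays with depth in the paper)
    """
    # Initial residual: mix the smoothed features with the initial representation H0.
    support = (1.0 - alpha) * (P_hat @ H) + alpha * H0
    # Identity mapping: blend the identity matrix with the learned weight matrix.
    out = support @ ((1.0 - beta) * np.eye(W.shape[0]) + beta * W)
    return np.maximum(out, 0.0)  # ReLU activation

# Usage on a toy 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P_hat = D_inv_sqrt @ A_hat @ D_inv_sqrt

rng = np.random.default_rng(0)
H0 = rng.standard_normal((4, 3))
H = H0
for layer in range(4):  # stack several layers; H0 is re-injected at every step
    W = rng.standard_normal((3, 3)) * 0.1
    H = gcnii_layer(H, H0, P_hat, W, alpha=0.1, beta=0.5 / (layer + 1))
```

Because H^{(0)} is re-injected at every layer and the weight matrix stays close to the identity at depth, stacking many such layers does not collapse node representations the way repeated plain GCN smoothing does.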

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-chen20v,
  title     = {Simple and Deep Graph Convolutional Networks},
  author    = {Chen, Ming and Wei, Zhewei and Huang, Zengfeng and Ding, Bolin and Li, Yaliang},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1725--1735},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/chen20v/chen20v.pdf},
  url       = {http://proceedings.mlr.press/v119/chen20v.html},
  abstract  = {Graph convolutional networks (GCNs) are a powerful deep learning approach for graph-structured data. Recently, GCNs and subsequent variants have shown superior performance in various application areas on real-world datasets. Despite their success, most current GCN models are shallow, due to the \emph{over-smoothing} problem. In this paper, we study the problem of designing and analyzing deep graph convolutional networks. We propose GCNII, an extension of the vanilla GCN model with two simple yet effective techniques: \emph{initial residual} and \emph{identity mapping}. We provide theoretical and empirical evidence that the two techniques effectively relieve the problem of over-smoothing. Our experiments show that the deep GCNII model outperforms state-of-the-art methods on various semi- and full-supervised tasks.}
}
Endnote
%0 Conference Paper
%T Simple and Deep Graph Convolutional Networks
%A Ming Chen
%A Zhewei Wei
%A Zengfeng Huang
%A Bolin Ding
%A Yaliang Li
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-chen20v
%I PMLR
%P 1725--1735
%U http://proceedings.mlr.press/v119/chen20v.html
%V 119
%X Graph convolutional networks (GCNs) are a powerful deep learning approach for graph-structured data. Recently, GCNs and subsequent variants have shown superior performance in various application areas on real-world datasets. Despite their success, most current GCN models are shallow, due to the \emph{over-smoothing} problem. In this paper, we study the problem of designing and analyzing deep graph convolutional networks. We propose GCNII, an extension of the vanilla GCN model with two simple yet effective techniques: \emph{initial residual} and \emph{identity mapping}. We provide theoretical and empirical evidence that the two techniques effectively relieve the problem of over-smoothing. Our experiments show that the deep GCNII model outperforms state-of-the-art methods on various semi- and full-supervised tasks.
APA
Chen, M., Wei, Z., Huang, Z., Ding, B. & Li, Y. (2020). Simple and Deep Graph Convolutional Networks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1725-1735. Available from http://proceedings.mlr.press/v119/chen20v.html.
