Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel

Thomas Gebhart
Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022, PMLR 196:124-132, 2022.

Abstract

Graph convolutional networks are a popular class of deep neural network algorithms which have shown success in a number of relational learning tasks. Despite their success, graph convolutional networks exhibit a number of peculiar features, including a bias towards learning oversmoothed and homophilic functions, which are not easily diagnosed due to the complex nature of these algorithms. We propose to bridge this gap in understanding by studying the neural tangent kernel of sheaf convolutional networks, a topological generalization of graph convolutional networks. To this end, we derive a parameterization of the neural tangent kernel for sheaf convolutional networks which separates the function into two parts: one driven by a forward diffusion process determined by the graph, and the other determined by the composite effect of nodes' activations on the output layer. This geometrically focused derivation produces a number of immediate insights which we discuss in detail.
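For readers unfamiliar with the objects named in the abstract, the following is a minimal sketch: the first line is the standard (empirical) neural tangent kernel definition, and the second is a generic sheaf convolution layer of the kind the paper generalizes. The symbols $L_{\mathcal{F}}$ (sheaf Laplacian), $W_t$, and $\sigma$ are illustrative assumptions; see the paper for the exact parameterization it derives.

```latex
% Empirical neural tangent kernel of a network f(\cdot\,;\theta):
\Theta(x, x') \;=\; \nabla_\theta f(x;\theta)^{\top}\, \nabla_\theta f(x';\theta)

% Illustrative sheaf convolution layer, diffusing node features X_t
% through the sheaf Laplacian L_F before a learned linear map W_t:
X_{t+1} \;=\; \sigma\!\left( \bigl(I - L_{\mathcal{F}}\bigr)\, X_t\, W_t \right)
```

When the sheaf is trivial (all restriction maps are identities), $L_{\mathcal{F}}$ reduces to the ordinary graph Laplacian and the layer above recovers a standard graph convolution, which is the sense in which sheaf networks generalize GCNs.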

Cite this Paper


BibTeX
@InProceedings{pmlr-v196-gebhart22a,
  title     = {Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel},
  author    = {Gebhart, Thomas},
  booktitle = {Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022},
  pages     = {124--132},
  year      = {2022},
  editor    = {Cloninger, Alexander and Doster, Timothy and Emerson, Tegan and Kaul, Manohar and Ktena, Ira and Kvinge, Henry and Miolane, Nina and Rieck, Bastian and Tymochko, Sarah and Wolf, Guy},
  volume    = {196},
  series    = {Proceedings of Machine Learning Research},
  month     = {25 Feb--22 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v196/gebhart22a/gebhart22a.pdf},
  url       = {https://proceedings.mlr.press/v196/gebhart22a.html},
  abstract  = {Graph convolutional networks are a popular class of deep neural network algorithms which have shown success in a number of relational learning tasks. Despite their success, graph convolutional networks exhibit a number of peculiar features, including a bias towards learning oversmoothed and homophilic functions, which are not easily diagnosed due to the complex nature of these algorithms. We propose to bridge this gap in understanding by studying the neural tangent kernel of sheaf convolutional networks, a topological generalization of graph convolutional networks. To this end, we derive a parameterization of the neural tangent kernel for sheaf convolutional networks which separates the function into two parts: one driven by a forward diffusion process determined by the graph, and the other determined by the composite effect of nodes' activations on the output layer. This geometrically focused derivation produces a number of immediate insights which we discuss in detail.}
}
Endnote
%0 Conference Paper
%T Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel
%A Thomas Gebhart
%B Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022
%C Proceedings of Machine Learning Research
%D 2022
%E Alexander Cloninger
%E Timothy Doster
%E Tegan Emerson
%E Manohar Kaul
%E Ira Ktena
%E Henry Kvinge
%E Nina Miolane
%E Bastian Rieck
%E Sarah Tymochko
%E Guy Wolf
%F pmlr-v196-gebhart22a
%I PMLR
%P 124--132
%U https://proceedings.mlr.press/v196/gebhart22a.html
%V 196
%X Graph convolutional networks are a popular class of deep neural network algorithms which have shown success in a number of relational learning tasks. Despite their success, graph convolutional networks exhibit a number of peculiar features, including a bias towards learning oversmoothed and homophilic functions, which are not easily diagnosed due to the complex nature of these algorithms. We propose to bridge this gap in understanding by studying the neural tangent kernel of sheaf convolutional networks, a topological generalization of graph convolutional networks. To this end, we derive a parameterization of the neural tangent kernel for sheaf convolutional networks which separates the function into two parts: one driven by a forward diffusion process determined by the graph, and the other determined by the composite effect of nodes' activations on the output layer. This geometrically focused derivation produces a number of immediate insights which we discuss in detail.
APA
Gebhart, T. (2022). Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel. Proceedings of Topological, Algebraic, and Geometric Learning Workshops 2022, in Proceedings of Machine Learning Research 196:124-132. Available from https://proceedings.mlr.press/v196/gebhart22a.html.