P-tensors: a General Framework for Higher Order Message Passing in Subgraph Neural Networks
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:424-432, 2024.
Abstract
Several recent papers have proposed increasing the expressiveness of graph neural networks by exploiting subgraphs or other topological structures. In parallel, researchers have investigated higher order permutation equivariant networks. In this paper we tie these two threads together by providing a general framework for higher order permutation equivariant message passing in subgraph neural networks. Our exposition hinges on so-called $P$-tensors, which provide a simple way to define the most general form of permutation equivariant message passing in this category of networks. We show that this paradigm can achieve state-of-the-art performance on benchmark molecular datasets.
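The central property referenced in the abstract is permutation equivariance of the message passing maps. As a concrete illustration only (not the paper's $P$-tensor construction), the following minimal NumPy sketch implements a first-order equivariant message passing step and numerically checks the equivariance property $f(PAP^\top, PX) = P\,f(A, X)$; all function and variable names here are hypothetical.

```python
import numpy as np

def message_pass(A, X, W_self, W_neigh):
    # Sum messages from neighbours, then mix with each node's own features.
    return A @ X @ W_neigh + X @ W_self

rng = np.random.default_rng(0)
n, d = 6, 4
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1); A = A + A.T              # symmetric adjacency, no self-loops
X = rng.standard_normal((n, d))             # node features
W_self = rng.standard_normal((d, d))
W_neigh = rng.standard_normal((d, d))

perm = rng.permutation(n)
P = np.eye(n)[perm]                         # permutation matrix

# Equivariance check: permuting the input graph permutes the output features.
lhs = message_pass(P @ A @ P.T, P @ X, W_self, W_neigh)
rhs = P @ message_pass(A, X, W_self, W_neigh)
assert np.allclose(lhs, rhs)
```

Higher order equivariant layers of the kind the paper studies generalize this idea from per-node (first-order) feature tensors to tensors indexed by tuples of nodes within subgraphs, with the same requirement that relabeling the nodes relabels the output consistently.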