P-tensors: a General Framework for Higher Order Message Passing in Subgraph Neural Networks

Andrew R. Hands, Tianyi Sun, Risi Kondor
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:424-432, 2024.

Abstract

Several recent papers have proposed increasing the expressiveness of graph neural networks by exploiting subgraphs or other topological structures. In parallel, researchers have investigated higher order permutation equivariant networks. In this paper we tie these two threads together by providing a general framework for higher order permutation equivariant message passing in subgraph neural networks. Our exposition hinges on so-called $P$-tensors, which provide a simple way to define the most general form of permutation equivariant message passing in this category of networks. We show that this paradigm can achieve state-of-the-art performance on benchmark molecular datasets.
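The abstract centers on permutation equivariant message passing. As a point of reference only, the following is a minimal NumPy sketch of *first-order* permutation equivariance for plain sum-aggregation message passing; the paper's $P$-tensors generalize this to higher-order tensors attached to subgraphs, which is not reproduced here.

```python
import numpy as np

def message_pass(A, X):
    """One round of sum-aggregation message passing:
    each node's new feature is the sum of its neighbors' features."""
    return A @ X

rng = np.random.default_rng(0)
n = 5
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T                      # symmetric adjacency, no self-loops
X = rng.standard_normal((n, 3))  # node features

# A node permutation P acts on the graph by A -> P A P^T
# and on the features by X -> P X.
P = np.eye(n)[rng.permutation(n)]

# Equivariance: permuting the inputs and then message passing
# gives the same result as message passing and then permuting.
lhs = message_pass(P @ A @ P.T, P @ X)
rhs = P @ message_pass(A, X)
assert np.allclose(lhs, rhs)
```

This check holds for any linear neighborhood aggregation; the contribution of the paper is characterizing the analogous equivariant maps when the objects being passed are higher-order tensors indexed by subgraph vertices.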

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-hands24a,
  title     = {P-tensors: a General Framework for Higher Order Message Passing in Subgraph Neural Networks},
  author    = {Hands, Andrew R. and Sun, Tianyi and Kondor, Risi},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {424--432},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/hands24a/hands24a.pdf},
  url       = {https://proceedings.mlr.press/v238/hands24a.html},
  abstract  = {Several recent papers have proposed increasing the expressiveness of graph neural networks by exploiting subgraphs or other topological structures. In parallel, researchers have investigated higher order permutation equivariant networks. In this paper we tie these two threads together by providing a general framework for higher order permutation equivariant message passing in subgraph neural networks. Our exposition hinges on so-called $P$-tensors, which provide a simple way to define the most general form of permutation equivariant message passing in this category of networks. We show that this paradigm can achieve state-of-the-art performance on benchmark molecular datasets.}
}
Endnote
%0 Conference Paper
%T P-tensors: a General Framework for Higher Order Message Passing in Subgraph Neural Networks
%A Andrew R. Hands
%A Tianyi Sun
%A Risi Kondor
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-hands24a
%I PMLR
%P 424--432
%U https://proceedings.mlr.press/v238/hands24a.html
%V 238
%X Several recent papers have proposed increasing the expressiveness of graph neural networks by exploiting subgraphs or other topological structures. In parallel, researchers have investigated higher order permutation equivariant networks. In this paper we tie these two threads together by providing a general framework for higher order permutation equivariant message passing in subgraph neural networks. Our exposition hinges on so-called $P$-tensors, which provide a simple way to define the most general form of permutation equivariant message passing in this category of networks. We show that this paradigm can achieve state-of-the-art performance on benchmark molecular datasets.
APA
Hands, A.R., Sun, T. & Kondor, R. (2024). P-tensors: a General Framework for Higher Order Message Passing in Subgraph Neural Networks. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:424-432. Available from https://proceedings.mlr.press/v238/hands24a.html.
