Transferable Hypergraph Neural Networks via Spectral Similarity
Proceedings of the Second Learning on Graphs Conference, PMLR 231:18:1-18:23, 2024.
Abstract
Hypergraphs model higher-order interactions in complex systems, e.g., chemicals reacting only in the presence of an enzyme or rumors spreading across groups, and encompass both undirected graphs and simplicial complexes. Nonetheless, due to computational complexity, machine learning on hypergraph-structured data is notoriously challenging. To address this challenge and enable the transfer of hypergraph neural network models, we extend results on the transferability of Graph Neural Networks (GNNs) to design a convolutional architecture that processes signals supported on hypergraphs via GNNs, which we call Hypergraph Expansion Neural Networks (HENNs). Exploiting multiple spectrally-similar graph representations of hypergraphs, we establish bounds on the transferability error. Experimental results illustrate the importance of considering multiple graph representations in HENNs and show promise of superior performance when transferability is required.
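To make the idea of processing hypergraph-supported signals through a graph representation concrete, the following is a minimal, hypothetical sketch: it builds a clique-expansion graph from a hypergraph incidence matrix and applies one simple degree-normalized graph convolution to node signals. This is not the paper's HENN implementation (which combines multiple spectrally-similar expansions); all function and variable names here are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's HENN code): clique expansion of a
# hypergraph followed by one simple graph-convolution step on node signals.
import numpy as np

def clique_expansion_adjacency(H):
    """H: (n_nodes, n_edges) binary incidence matrix of a hypergraph.
    Returns the adjacency matrix of the clique expansion, where two nodes
    are connected iff they share at least one hyperedge."""
    A = (H @ H.T) > 0          # co-membership -> connectivity
    A = A.astype(float)
    np.fill_diagonal(A, 0.0)   # remove self-loops
    return A

def graph_convolution(A, X, W):
    """One convolution layer: degree-normalized neighborhood aggregation,
    then a linear map and ReLU nonlinearity."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0        # avoid division by zero for isolated nodes
    return np.maximum((A / deg) @ X @ W, 0.0)

# Toy hypergraph: 4 nodes, 2 hyperedges ({0,1,2} and {2,3}).
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
X = np.random.randn(4, 3)      # node signals (4 nodes, 3 features)
W = np.random.randn(3, 2)      # weights mapping 3 -> 2 features

A = clique_expansion_adjacency(H)
Y = graph_convolution(A, X, W)
print(Y.shape)                 # (4, 2)
```

In this sketch, the clique expansion is only one possible graph representation; the abstract's point is that several such spectrally-similar representations can be used together, and that spectral similarity between them underlies the transferability bounds.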