Transferable Hypergraph Neural Networks via Spectral Similarity

Mikhail Hayhoe, Hans Matthew Riess, Michael M. Zavlanos, Victor Preciado, Alejandro Ribeiro
Proceedings of the Second Learning on Graphs Conference, PMLR 231:18:1-18:23, 2024.

Abstract

Hypergraphs model higher-order interactions in complex systems, e.g., chemicals that react only in the presence of an enzyme or rumors that spread across groups, and generalize both undirected graphs and simplicial complexes. Nonetheless, owing to their computational complexity, machine learning on hypergraph-structured data is notoriously challenging. To address this challenge and enable the transfer of hypergraph neural network models, we extend results on the transferability of Graph Neural Networks (GNNs) to design a convolutional architecture for processing signals supported on hypergraphs via GNNs, which we call Hypergraph Expansion Neural Networks (HENNs). By exploiting multiple spectrally-similar graph representations of a hypergraph, we establish bounds on the transferability error. Experimental results illustrate the importance of considering multiple graph representations in HENNs, and show the promise of superior performance when transferability is required.
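To make the expansion-then-GNN idea above concrete, the minimal NumPy sketch below builds two graph representations of a toy hypergraph (an unweighted clique expansion and a hyperedge-size-weighted variant), applies the same polynomial graph convolution to each, and averages the node-level outputs. The choice of expansions, the filter form, and the averaging step are illustrative assumptions for exposition only, not the paper's exact HENN construction or its spectral-similarity guarantees.

import numpy as np

def clique_expansion(incidence):
    # Adjacency of the clique expansion: nodes sharing a hyperedge become neighbors.
    # incidence: (n_nodes, n_edges) binary hypergraph incidence matrix.
    A = incidence @ incidence.T
    np.fill_diagonal(A, 0)                      # drop self-loops
    return (A > 0).astype(float)

def weighted_clique_expansion(incidence):
    # Illustrative variant: weight each hyperedge's clique by 1 / |e|.
    edge_sizes = incidence.sum(axis=0)
    A = incidence @ np.diag(1.0 / edge_sizes) @ incidence.T
    np.fill_diagonal(A, 0)
    return A

def normalized_shift(A):
    # Degree-normalized adjacency D^{-1/2} A D^{-1/2}, a common GNN shift operator.
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def graph_convolution(S, X, weights):
    # One polynomial graph-convolution layer: sum_k S^k X W_k, followed by ReLU.
    out = np.zeros((X.shape[0], weights[0].shape[1]))
    Sk_X = X
    for W in weights:
        out += Sk_X @ W
        Sk_X = S @ Sk_X                         # next power of the shift applied to the signal
    return np.maximum(out, 0.0)

# Toy hypergraph: 5 nodes, 3 hyperedges.
H = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))                                   # node features
weights = [rng.normal(size=(4, 8)) * 0.1 for _ in range(3)]   # K = 3 filter taps, shared across expansions

shifts = [normalized_shift(clique_expansion(H)),
          normalized_shift(weighted_clique_expansion(H))]
Y = np.mean([graph_convolution(S, X, weights) for S in shifts], axis=0)
print(Y.shape)   # (5, 8): one embedding per hypergraph node

Averaging the per-expansion outputs is just one simple way to combine the graph representations; a learned or max-based aggregation across expansions is equally plausible, and the paper should be consulted for the architecture actually analyzed.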

Cite this Paper


BibTeX
@InProceedings{pmlr-v231-hayhoe24a,
  title     = {Transferable Hypergraph Neural Networks via Spectral Similarity},
  author    = {Hayhoe, Mikhail and Riess, Hans Matthew and Zavlanos, Michael M. and Preciado, Victor and Ribeiro, Alejandro},
  booktitle = {Proceedings of the Second Learning on Graphs Conference},
  pages     = {18:1--18:23},
  year      = {2024},
  editor    = {Villar, Soledad and Chamberlain, Benjamin},
  volume    = {231},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v231/hayhoe24a/hayhoe24a.pdf},
  url       = {https://proceedings.mlr.press/v231/hayhoe24a.html}
}
Endnote
%0 Conference Paper
%T Transferable Hypergraph Neural Networks via Spectral Similarity
%A Mikhail Hayhoe
%A Hans Matthew Riess
%A Michael M. Zavlanos
%A Victor Preciado
%A Alejandro Ribeiro
%B Proceedings of the Second Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2024
%E Soledad Villar
%E Benjamin Chamberlain
%F pmlr-v231-hayhoe24a
%I PMLR
%P 18:1--18:23
%U https://proceedings.mlr.press/v231/hayhoe24a.html
%V 231
APA
Hayhoe, M., Riess, H.M., Zavlanos, M.M., Preciado, V. & Ribeiro, A. (2024). Transferable Hypergraph Neural Networks via Spectral Similarity. Proceedings of the Second Learning on Graphs Conference, in Proceedings of Machine Learning Research 231:18:1-18:23. Available from https://proceedings.mlr.press/v231/hayhoe24a.html.
