Sheaf Diffusion Goes Nonlinear: Enhancing GNNs with Adaptive Sheaf Laplacians

Olga Zaghen, Antonio Longa, Steve Azzolin, Lev Telyatnikov, Andrea Passerini, Pietro Liò
Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), PMLR 251:264-276, 2024.

Abstract

Sheaf Neural Networks (SNNs) have recently been introduced to enhance Graph Neural Networks (GNNs) in their capability to learn from graphs. Previous studies either focus on linear sheaf Laplacians or hand-crafted nonlinear sheaf Laplacians. The former are not always expressive enough in modeling complex interactions between nodes, such as antagonistic dynamics and bounded confidence dynamics, while the latter use a fixed nonlinear function that is not adapted to the data at hand. To enhance the capability of SNNs to capture complex node-to-node interactions while adapting to different scenarios, we propose a Nonlinear Sheaf Diffusion (NLSD) model, which incorporates nonlinearity into the Laplacian of SNNs through a general function learned from data. Our model is validated on a synthetic community detection dataset, where it outperforms linear SNNs and common GNN baselines in a node classification task, showcasing its ability to leverage complex network dynamics.
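
The abstract does not spell out the exact formulation, so the following is a minimal, hypothetical PyTorch sketch of the general idea it describes: a sheaf coboundary built from edge restriction maps is composed with a learnable per-edge nonlinearity phi, giving a diffusion step of the form x <- x - tau * delta^T phi(delta x). All names here (NonlinearSheafDiffusionLayer, phi, F_source, F_target) are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of one nonlinear sheaf diffusion step (not the paper's code).
    # Assumption: node features live in d-dimensional stalks, each edge carries two
    # restriction maps F_{v <| e}, and the nonlinearity phi is a small MLP learned
    # from data, so the diffusion uses L_phi(x) = delta^T phi(delta x).
    import torch
    import torch.nn as nn

    class NonlinearSheafDiffusionLayer(nn.Module):
        def __init__(self, stalk_dim: int, hidden_dim: int = 16):
            super().__init__()
            # Learned nonlinearity phi, applied edge-wise to the coboundary delta(x).
            self.phi = nn.Sequential(
                nn.Linear(stalk_dim, hidden_dim),
                nn.Tanh(),
                nn.Linear(hidden_dim, stalk_dim),
            )

        def forward(self, x, edge_index, F_source, F_target, step: float = 0.1):
            # x:                  (num_nodes, stalk_dim) node features in the stalks
            # edge_index:         (2, num_edges) source/target node indices
            # F_source, F_target: (num_edges, stalk_dim, stalk_dim) restriction maps
            src, dst = edge_index
            # Coboundary: compare the two restrictions of the endpoints along each edge.
            dx = torch.bmm(F_source, x[src].unsqueeze(-1)).squeeze(-1) \
               - torch.bmm(F_target, x[dst].unsqueeze(-1)).squeeze(-1)
            # Learned edge-wise nonlinearity.
            y = self.phi(dx)
            # Transpose of the coboundary: scatter the transformed edge signal back to nodes.
            out = torch.zeros_like(x)
            out.index_add_(0, src, torch.bmm(F_source.transpose(1, 2), y.unsqueeze(-1)).squeeze(-1))
            out.index_add_(0, dst, -torch.bmm(F_target.transpose(1, 2), y.unsqueeze(-1)).squeeze(-1))
            # One explicit Euler step of the diffusion x' = x - step * L_phi(x).
            return x - step * out

When phi is the identity this reduces to the standard (linear) sheaf Laplacian; fixing phi by hand (e.g. a thresholding rule for bounded-confidence dynamics) corresponds to the hand-crafted nonlinear variants the abstract contrasts against, whereas here phi is learned from the data.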

Cite this Paper


BibTeX
@InProceedings{pmlr-v251-zaghen24a,
  title     = {Sheaf Diffusion Goes Nonlinear: Enhancing GNNs with Adaptive Sheaf Laplacians},
  author    = {Zaghen, Olga and Longa, Antonio and Azzolin, Steve and Telyatnikov, Lev and Passerini, Andrea and Li\`o, Pietro},
  booktitle = {Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)},
  pages     = {264--276},
  year      = {2024},
  editor    = {Vadgama, Sharvaree and Bekkers, Erik and Pouplin, Alison and Kaba, Sekou-Oumar and Walters, Robin and Lawrence, Hannah and Emerson, Tegan and Kvinge, Henry and Tomczak, Jakub and Jegelka, Stephanie},
  volume    = {251},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v251/main/assets/zaghen24a/zaghen24a.pdf},
  url       = {https://proceedings.mlr.press/v251/zaghen24a.html},
  abstract  = {Sheaf Neural Networks (SNNs) have recently been introduced to enhance Graph Neural Networks (GNNs) in their capability to learn from graphs. Previous studies either focus on linear sheaf Laplacians or hand-crafted nonlinear sheaf Laplacians. The former are not always expressive enough in modeling complex interactions between nodes, such as antagonistic dynamics and bounded confidence dynamics, while the latter use a fixed nonlinear function that is not adapted to the data at hand. To enhance the capability of SNNs to capture complex node-to-node interactions while adapting to different scenarios, we propose a Nonlinear Sheaf Diffusion (NLSD) model, which incorporates nonlinearity into the Laplacian of SNNs through a general function learned from data. Our model is validated on a synthetic community detection dataset, where it outperforms linear SNNs and common GNN baselines in a node classification task, showcasing its ability to leverage complex network dynamics.}
}
Endnote
%0 Conference Paper
%T Sheaf Diffusion Goes Nonlinear: Enhancing GNNs with Adaptive Sheaf Laplacians
%A Olga Zaghen
%A Antonio Longa
%A Steve Azzolin
%A Lev Telyatnikov
%A Andrea Passerini
%A Pietro Liò
%B Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)
%C Proceedings of Machine Learning Research
%D 2024
%E Sharvaree Vadgama
%E Erik Bekkers
%E Alison Pouplin
%E Sekou-Oumar Kaba
%E Robin Walters
%E Hannah Lawrence
%E Tegan Emerson
%E Henry Kvinge
%E Jakub Tomczak
%E Stephanie Jegelka
%F pmlr-v251-zaghen24a
%I PMLR
%P 264--276
%U https://proceedings.mlr.press/v251/zaghen24a.html
%V 251
%X Sheaf Neural Networks (SNNs) have recently been introduced to enhance Graph Neural Networks (GNNs) in their capability to learn from graphs. Previous studies either focus on linear sheaf Laplacians or hand-crafted nonlinear sheaf Laplacians. The former are not always expressive enough in modeling complex interactions between nodes, such as antagonistic dynamics and bounded confidence dynamics, while the latter use a fixed nonlinear function that is not adapted to the data at hand. To enhance the capability of SNNs to capture complex node-to-node interactions while adapting to different scenarios, we propose a Nonlinear Sheaf Diffusion (NLSD) model, which incorporates nonlinearity into the Laplacian of SNNs through a general function learned from data. Our model is validated on a synthetic community detection dataset, where it outperforms linear SNNs and common GNN baselines in a node classification task, showcasing its ability to leverage complex network dynamics.
APA
Zaghen, O., Longa, A., Azzolin, S., Telyatnikov, L., Passerini, A. & Liò, P. (2024). Sheaf Diffusion Goes Nonlinear: Enhancing GNNs with Adaptive Sheaf Laplacians. Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), in Proceedings of Machine Learning Research 251:264-276. Available from https://proceedings.mlr.press/v251/zaghen24a.html.

Related Material

Download PDF: https://raw.githubusercontent.com/mlresearch/v251/main/assets/zaghen24a/zaghen24a.pdf