Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs Without Message Passing

Matthias Kohn, Marcel Hoffmann, Ansgar Scherp
Proceedings of the Third Learning on Graphs Conference, PMLR 269:11:1-11:21, 2025.

Abstract

Message Passing Neural Networks (MPNNs) have demonstrated remarkable success in node classification on homophilic graphs. It has been shown that they rely not solely on homophily but on the neighborhood distributions of nodes, i.e., the consistency of the neighborhood label distribution within the same class. MLP-based models do not use message passing; for example, Graph-MLP incorporates the neighborhood in a separate loss function. These models are faster and more robust to edge noise. Graph-MLP maps adjacent nodes closer in the embedding space but is unaware of the neighborhood pattern of the labels, i.e., it relies solely on homophily. Edge-Splitting GNN (ES-GNN) is a model specialized for heterophilic graphs that splits the edges into task-relevant and task-irrelevant ones. To mitigate the limitations of Graph-MLP on heterophilic graphs, we propose ES-MLP, which combines Graph-MLP with the edge-splitting mechanism of ES-GNN. It incorporates the edge splitting into the loss of Graph-MLP to learn two separate adjacency matrices based on relevant and irrelevant feature pairs. Our experiments on seven datasets with five baselines show that ES-MLP is on par with homophilic and heterophilic models on all datasets without using edges during inference. We show that ES-MLP is robust to multiple types of edge noise during inference and that its inference time is two to five times faster than that of commonly used MPNNs. We will make our source code available.
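The sketch below illustrates, in PyTorch, one way an edge-splitting term could be folded into a Graph-MLP-style training loss while keeping inference edge-free. It is a reconstruction based only on the abstract, not the authors' implementation; the names (MLPEncoder, EdgeSplitter, ncontrast_loss), the pairwise edge scorer, and the loss weighting are assumptions.

# Hypothetical sketch of an ES-MLP-style objective: a plain MLP classifier whose
# training loss also pulls embeddings of "task-relevant" neighbors together.
# All names and the loss weighting are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPEncoder(nn.Module):
    """Plain MLP; the only component needed at inference time."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                  nn.Linear(hid_dim, hid_dim))
        self.head = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        z = self.body(x)            # embeddings used by the neighborhood loss
        return z, self.head(z)      # logits used by the cross-entropy loss

class EdgeSplitter(nn.Module):
    """Scores each edge as task-relevant vs. task-irrelevant from its node-feature pair."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(2 * in_dim, hid_dim), nn.ReLU(),
                                    nn.Linear(hid_dim, 1))

    def forward(self, x, edge_index):
        src, dst = edge_index
        rel = torch.sigmoid(self.scorer(torch.cat([x[src], x[dst]], dim=-1)))
        rel = rel.squeeze(-1)       # relevance weight in (0, 1) per edge
        return rel, 1.0 - rel       # weights of the "relevant"/"irrelevant" adjacency

def ncontrast_loss(z, edge_index, edge_weight, tau=1.0):
    """Graph-MLP-style neighborhood contrast, restricted to the weighted (relevant) edges."""
    n = z.size(0)
    sim = torch.exp(F.normalize(z) @ F.normalize(z).t() / tau)
    a = torch.zeros(n, n, device=z.device)
    a[edge_index[0], edge_index[1]] = edge_weight   # dense toy version; use sparse ops in practice
    pos = (sim * a).sum(dim=1)
    return -torch.log((pos + 1e-8) / sim.sum(dim=1)).mean()

# Toy usage: training combines cross-entropy with the relevant-edge contrastive term;
# inference would call only enc(x) and therefore needs no edges at all.
x = torch.randn(6, 16)                                   # node features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])  # toy edge list
y = torch.randint(0, 3, (6,))                            # node labels
enc, splitter = MLPEncoder(16, 32, 3), EdgeSplitter(16, 32)
z, logits = enc(x)
w_rel, w_irr = splitter(x, edge_index)
loss = F.cross_entropy(logits, y) + 1.0 * ncontrast_loss(z, edge_index, w_rel)
loss.backward()

Because the graph enters only through the training loss, the trained MLP classifies nodes from their features alone, which is consistent with the edge-free inference and edge-noise robustness described in the abstract.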

Cite this Paper


BibTeX
@InProceedings{pmlr-v269-kohn25a,
  title     = {Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs Without Message Passing},
  author    = {Kohn, Matthias and Hoffmann, Marcel and Scherp, Ansgar},
  booktitle = {Proceedings of the Third Learning on Graphs Conference},
  pages     = {11:1--11:21},
  year      = {2025},
  editor    = {Wolf, Guy and Krishnaswamy, Smita},
  volume    = {269},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--29 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v269/main/assets/kohn25a/kohn25a.pdf},
  url       = {https://proceedings.mlr.press/v269/kohn25a.html},
  abstract  = {Message Passing Neural Networks (MPNNs) have demonstrated remarkable success in node classification on homophilic graphs. It has been shown that they rely not solely on homophily but on the neighborhood distributions of nodes, i.e., the consistency of the neighborhood label distribution within the same class. MLP-based models do not use message passing; for example, Graph-MLP incorporates the neighborhood in a separate loss function. These models are faster and more robust to edge noise. Graph-MLP maps adjacent nodes closer in the embedding space but is unaware of the neighborhood pattern of the labels, i.e., it relies solely on homophily. Edge-Splitting GNN (ES-GNN) is a model specialized for heterophilic graphs that splits the edges into task-relevant and task-irrelevant ones. To mitigate the limitations of Graph-MLP on heterophilic graphs, we propose ES-MLP, which combines Graph-MLP with the edge-splitting mechanism of ES-GNN. It incorporates the edge splitting into the loss of Graph-MLP to learn two separate adjacency matrices based on relevant and irrelevant feature pairs. Our experiments on seven datasets with five baselines show that ES-MLP is on par with homophilic and heterophilic models on all datasets without using edges during inference. We show that ES-MLP is robust to multiple types of edge noise during inference and that its inference time is two to five times faster than that of commonly used MPNNs. We will make our source code available.}
}
Endnote
%0 Conference Paper
%T Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs Without Message Passing
%A Matthias Kohn
%A Marcel Hoffmann
%A Ansgar Scherp
%B Proceedings of the Third Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2025
%E Guy Wolf
%E Smita Krishnaswamy
%F pmlr-v269-kohn25a
%I PMLR
%P 11:1--11:21
%U https://proceedings.mlr.press/v269/kohn25a.html
%V 269
%X Message Passing Neural Networks (MPNNs) have demonstrated remarkable success in node classification on homophilic graphs. It has been shown that they rely not solely on homophily but on the neighborhood distributions of nodes, i.e., the consistency of the neighborhood label distribution within the same class. MLP-based models do not use message passing; for example, Graph-MLP incorporates the neighborhood in a separate loss function. These models are faster and more robust to edge noise. Graph-MLP maps adjacent nodes closer in the embedding space but is unaware of the neighborhood pattern of the labels, i.e., it relies solely on homophily. Edge-Splitting GNN (ES-GNN) is a model specialized for heterophilic graphs that splits the edges into task-relevant and task-irrelevant ones. To mitigate the limitations of Graph-MLP on heterophilic graphs, we propose ES-MLP, which combines Graph-MLP with the edge-splitting mechanism of ES-GNN. It incorporates the edge splitting into the loss of Graph-MLP to learn two separate adjacency matrices based on relevant and irrelevant feature pairs. Our experiments on seven datasets with five baselines show that ES-MLP is on par with homophilic and heterophilic models on all datasets without using edges during inference. We show that ES-MLP is robust to multiple types of edge noise during inference and that its inference time is two to five times faster than that of commonly used MPNNs. We will make our source code available.
APA
Kohn, M., Hoffmann, M. & Scherp, A. (2025). Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs Without Message Passing. Proceedings of the Third Learning on Graphs Conference, in Proceedings of Machine Learning Research 269:11:1-11:21. Available from https://proceedings.mlr.press/v269/kohn25a.html.
