Alignment of MPNNs and Graph Transformers

Bao Nguyen, Anjana Yodaiken, Petar Veličković
Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), PMLR 251:35-49, 2024.

Abstract

As the complexity of machine learning (ML) model architectures increases, it is important to understand to what degree simpler and more efficient architectures can align with their complex counterparts. In this paper, we investigate the degree to which a Message Passing Neural Network (MPNN) can operate similarly to a Graph Transformer. We do this by training an MPNN to align with the intermediate embeddings of a Relational Transformer (RT). Throughout this process, we explore variations of the standard MPNN and assess the impact of different components on the degree of alignment. Our findings suggest that an MPNN can align to RT and the most important components that affect the alignment are the MPNN’s permutation invariant aggregation function, virtual node and layer normalisation.
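The alignment procedure described in the abstract (training an MPNN to match a Relational Transformer's intermediate embeddings) can be framed as a layer-wise regression objective. The sketch below is a minimal, framework-free illustration of such an objective; the function name `alignment_loss` and the nested-list data layout are illustrative assumptions, not the authors' implementation.

```python
def alignment_loss(mpnn_layers, rt_layers):
    """Mean squared error between corresponding intermediate embeddings.

    Illustrative sketch, not the paper's actual code. Both arguments are
    lists with one entry per layer; each entry is a per-node embedding
    matrix given as nested lists of floats. The loss averages the squared
    difference over all layers, nodes, and embedding dimensions.
    """
    assert len(mpnn_layers) == len(rt_layers), "models must have matching depth"
    total, count = 0.0, 0
    for h_mpnn, h_rt in zip(mpnn_layers, rt_layers):
        for node_a, node_b in zip(h_mpnn, h_rt):
            for a, b in zip(node_a, node_b):
                total += (a - b) ** 2
                count += 1
    return total / count
```

In practice this loss would be computed on batches of graphs with a deep-learning framework and minimised by gradient descent on the MPNN's parameters, while the transformer's embeddings serve as fixed targets.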

Cite this Paper


BibTeX
@InProceedings{pmlr-v251-nguyen24a,
  title     = {Alignment of MPNNs and Graph Transformers},
  author    = {Nguyen, Bao and Yodaiken, Anjana and Veli\v{c}kovi\'{c}, Petar},
  booktitle = {Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)},
  pages     = {35--49},
  year      = {2024},
  editor    = {Vadgama, Sharvaree and Bekkers, Erik and Pouplin, Alison and Kaba, Sekou-Oumar and Walters, Robin and Lawrence, Hannah and Emerson, Tegan and Kvinge, Henry and Tomczak, Jakub and Jegelka, Stephanie},
  volume    = {251},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v251/main/assets/nguyen24a/nguyen24a.pdf},
  url       = {https://proceedings.mlr.press/v251/nguyen24a.html},
  abstract  = {As the complexity of machine learning (ML) model architectures increases, it is important to understand to what degree simpler and more efficient architectures can align with their complex counterparts. In this paper, we investigate the degree to which a Message Passing Neural Network (MPNN) can operate similarly to a Graph Transformer. We do this by training an MPNN to align with the intermediate embeddings of a Relational Transformer (RT). Throughout this process, we explore variations of the standard MPNN and assess the impact of different components on the degree of alignment. Our findings suggest that an MPNN can align to RT and the most important components that affect the alignment are the MPNN's permutation invariant aggregation function, virtual node and layer normalisation.}
}
Endnote
%0 Conference Paper
%T Alignment of MPNNs and Graph Transformers
%A Bao Nguyen
%A Anjana Yodaiken
%A Petar Veličković
%B Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)
%C Proceedings of Machine Learning Research
%D 2024
%E Sharvaree Vadgama
%E Erik Bekkers
%E Alison Pouplin
%E Sekou-Oumar Kaba
%E Robin Walters
%E Hannah Lawrence
%E Tegan Emerson
%E Henry Kvinge
%E Jakub Tomczak
%E Stephanie Jegelka
%F pmlr-v251-nguyen24a
%I PMLR
%P 35--49
%U https://proceedings.mlr.press/v251/nguyen24a.html
%V 251
%X As the complexity of machine learning (ML) model architectures increases, it is important to understand to what degree simpler and more efficient architectures can align with their complex counterparts. In this paper, we investigate the degree to which a Message Passing Neural Network (MPNN) can operate similarly to a Graph Transformer. We do this by training an MPNN to align with the intermediate embeddings of a Relational Transformer (RT). Throughout this process, we explore variations of the standard MPNN and assess the impact of different components on the degree of alignment. Our findings suggest that an MPNN can align to RT and the most important components that affect the alignment are the MPNN’s permutation invariant aggregation function, virtual node and layer normalisation.
APA
Nguyen, B., Yodaiken, A. & Veličković, P. (2024). Alignment of MPNNs and Graph Transformers. Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), in Proceedings of Machine Learning Research 251:35-49. Available from https://proceedings.mlr.press/v251/nguyen24a.html.