Primphormer: Efficient Graph Transformers with Primal Representations

Mingzhen He, Ruikai Yang, Hanling Tian, Youmei Qiu, Xiaolin Huang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:22792-22815, 2025.

Abstract

Graph Transformers (GTs) have emerged as a promising approach for graph representation learning. Despite their successes, the quadratic complexity of GTs limits scalability on large graphs due to their pair-wise computations. To fundamentally reduce the computational burden of GTs, we propose a primal-dual framework that interprets the self-attention mechanism on graphs as a dual representation. Based on this framework, we develop Primphormer, an efficient GT that leverages a primal representation with linear complexity. Theoretical analysis reveals that Primphormer serves as a universal approximator for functions on both sequences and graphs, while also retaining its expressive power for distinguishing non-isomorphic graphs. Extensive experiments on various graph benchmarks demonstrate that Primphormer achieves competitive empirical results while maintaining more user-friendly memory and computational costs.
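
To make the complexity contrast in the abstract concrete, the sketch below compares standard pair-wise self-attention with a generic feature-map ("primal"-style) reformulation that never materializes the n-by-n attention matrix. This is only an illustrative sketch of the linear-attention idea, not Primphormer's actual construction; the function names and the ELU-based feature map are assumptions made for illustration.

    import torch
    import torch.nn.functional as F

    def pairwise_attention(Q, K, V):
        # Standard self-attention: the n-by-n score matrix costs O(n^2 d) time and O(n^2) memory.
        scores = torch.softmax(Q @ K.T / Q.shape[-1] ** 0.5, dim=-1)
        return scores @ V

    def primal_style_attention(Q, K, V):
        # Generic feature-map trick (illustrative, not the paper's method): with phi(.) >= 0,
        # reassociating phi(Q) (phi(K)^T V) avoids the n-by-n matrix, so cost is O(n d^2).
        phi_q = F.elu(Q) + 1.0
        phi_k = F.elu(K) + 1.0
        kv = phi_k.T @ V                                         # d-by-d key/value summary
        normalizer = phi_q @ phi_k.sum(dim=0, keepdim=True).T    # per-row normalization, shape (n, 1)
        return (phi_q @ kv) / (normalizer + 1e-6)

    # Tiny usage check on random node features: n nodes, d-dimensional embeddings.
    n, d = 1024, 64
    Q, K, V = torch.randn(n, d), torch.randn(n, d), torch.randn(n, d)
    print(primal_style_attention(Q, K, V).shape)                 # torch.Size([1024, 64])
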

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-he25t,
  title     = {Primphormer: Efficient Graph Transformers with Primal Representations},
  author    = {He, Mingzhen and Yang, Ruikai and Tian, Hanling and Qiu, Youmei and Huang, Xiaolin},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {22792--22815},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/he25t/he25t.pdf},
  url       = {https://proceedings.mlr.press/v267/he25t.html},
  abstract  = {Graph Transformers (GTs) have emerged as a promising approach for graph representation learning. Despite their successes, the quadratic complexity of GTs limits scalability on large graphs due to their pair-wise computations. To fundamentally reduce the computational burden of GTs, we propose a primal-dual framework that interprets the self-attention mechanism on graphs as a dual representation. Based on this framework, we develop Primphormer, an efficient GT that leverages a primal representation with linear complexity. Theoretical analysis reveals that Primphormer serves as a universal approximator for functions on both sequences and graphs, while also retaining its expressive power for distinguishing non-isomorphic graphs. Extensive experiments on various graph benchmarks demonstrate that Primphormer achieves competitive empirical results while maintaining more user-friendly memory and computational costs.}
}
Endnote
%0 Conference Paper
%T Primphormer: Efficient Graph Transformers with Primal Representations
%A Mingzhen He
%A Ruikai Yang
%A Hanling Tian
%A Youmei Qiu
%A Xiaolin Huang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-he25t
%I PMLR
%P 22792--22815
%U https://proceedings.mlr.press/v267/he25t.html
%V 267
%X Graph Transformers (GTs) have emerged as a promising approach for graph representation learning. Despite their successes, the quadratic complexity of GTs limits scalability on large graphs due to their pair-wise computations. To fundamentally reduce the computational burden of GTs, we propose a primal-dual framework that interprets the self-attention mechanism on graphs as a dual representation. Based on this framework, we develop Primphormer, an efficient GT that leverages a primal representation with linear complexity. Theoretical analysis reveals that Primphormer serves as a universal approximator for functions on both sequences and graphs, while also retaining its expressive power for distinguishing non-isomorphic graphs. Extensive experiments on various graph benchmarks demonstrate that Primphormer achieves competitive empirical results while maintaining more user-friendly memory and computational costs.
APA
He, M., Yang, R., Tian, H., Qiu, Y. & Huang, X. (2025). Primphormer: Efficient Graph Transformers with Primal Representations. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:22792-22815. Available from https://proceedings.mlr.press/v267/he25t.html.
