Zero-Shot Generalization of GNNs over Distinct Attribute Domains

Yangyi Shen, Jincheng Zhou, Beatrice Bevilacqua, Joshua Robinson, Charilaos Kanatsoulis, Jure Leskovec, Bruno Ribeiro
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:54701-54731, 2025.

Abstract

Traditional Graph Neural Networks (GNNs) cannot generalize to new graphs whose node attributes differ from those seen in training, making zero-shot generalization across different node attribute domains an open challenge in graph machine learning. In this paper, we propose STAGE, which encodes statistical dependencies between attributes rather than the individual attribute values, which may differ in test graphs. By assuming these dependencies remain invariant under changes in node attributes, STAGE achieves provable generalization guarantees for a family of domain shifts. Empirically, STAGE demonstrates strong zero-shot performance on medium-sized datasets: when trained on multiple graph datasets with different attribute spaces (varying in type and number) and evaluated on graphs with entirely new attributes, STAGE achieves a relative improvement in Hits@1 of between 40% and 103% in link prediction and a 10% improvement in node classification compared to state-of-the-art baselines.
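The key idea above can be illustrated with a minimal sketch. The snippet below is not the authors' STAGE implementation; it is a hypothetical toy example showing why encoding statistical dependencies between attributes (here, the absolute correlation matrix) can stay invariant under a family of domain shifts (simulated as per-attribute rescaling) even though the raw attribute values change:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical node-attribute matrix: 5 nodes x 3 attributes (training domain).
X_train = rng.normal(size=(5, 3))

# A new "attribute domain": same dependency structure, different raw values.
# We simulate the domain shift as an invertible per-attribute rescaling.
scale = np.array([2.0, -1.0, 0.5])
X_test = X_train * scale

def dependency_encoding(X):
    """Encode statistical dependencies between attributes (their correlation
    matrix) instead of the raw attribute values. Correlations are invariant
    to per-attribute rescaling up to sign."""
    C = np.corrcoef(X, rowvar=False)  # attribute-by-attribute correlations
    return np.abs(C)                  # drop sign so flips are also invariant

enc_train = dependency_encoding(X_train)
enc_test = dependency_encoding(X_test)

# The raw attributes differ across domains...
assert not np.allclose(X_train, X_test)
# ...but the dependency-based encoding is identical.
assert np.allclose(enc_train, enc_test)
```

A model that consumes such dependency encodings (rather than raw attribute vectors) would see the same input in both domains, which is the intuition behind the paper's invariance assumption; the actual STAGE architecture and its guarantees are developed in the full paper.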

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-shen25p,
  title     = {Zero-Shot Generalization of {GNN}s over Distinct Attribute Domains},
  author    = {Shen, Yangyi and Zhou, Jincheng and Bevilacqua, Beatrice and Robinson, Joshua and Kanatsoulis, Charilaos and Leskovec, Jure and Ribeiro, Bruno},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {54701--54731},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/shen25p/shen25p.pdf},
  url       = {https://proceedings.mlr.press/v267/shen25p.html},
  abstract  = {Traditional Graph Neural Networks (GNNs) cannot generalize to new graphs with node attributes different from the training ones, making zero-shot generalization across different node attribute domains an open challenge in graph machine learning. In this paper, we propose STAGE, which encodes statistical dependencies between attributes rather than individual attribute values, which may differ in test graphs. By assuming these dependencies remain invariant under changes in node attributes, STAGE achieves provable generalization guarantees for a family of domain shifts. Empirically, STAGE demonstrates strong zero-shot performance on medium-sized datasets: when trained on multiple graph datasets with different attribute spaces (varying in types and number) and evaluated on graphs with entirely new attributes, STAGE achieves a relative improvement in Hits@1 between 40% to 103% in link prediction and a 10% improvement in node classification compared to state-of-the-art baselines.}
}
Endnote
%0 Conference Paper
%T Zero-Shot Generalization of GNNs over Distinct Attribute Domains
%A Yangyi Shen
%A Jincheng Zhou
%A Beatrice Bevilacqua
%A Joshua Robinson
%A Charilaos Kanatsoulis
%A Jure Leskovec
%A Bruno Ribeiro
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-shen25p
%I PMLR
%P 54701--54731
%U https://proceedings.mlr.press/v267/shen25p.html
%V 267
%X Traditional Graph Neural Networks (GNNs) cannot generalize to new graphs with node attributes different from the training ones, making zero-shot generalization across different node attribute domains an open challenge in graph machine learning. In this paper, we propose STAGE, which encodes statistical dependencies between attributes rather than individual attribute values, which may differ in test graphs. By assuming these dependencies remain invariant under changes in node attributes, STAGE achieves provable generalization guarantees for a family of domain shifts. Empirically, STAGE demonstrates strong zero-shot performance on medium-sized datasets: when trained on multiple graph datasets with different attribute spaces (varying in types and number) and evaluated on graphs with entirely new attributes, STAGE achieves a relative improvement in Hits@1 between 40% to 103% in link prediction and a 10% improvement in node classification compared to state-of-the-art baselines.
APA
Shen, Y., Zhou, J., Bevilacqua, B., Robinson, J., Kanatsoulis, C., Leskovec, J. & Ribeiro, B. (2025). Zero-Shot Generalization of GNNs over Distinct Attribute Domains. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:54701-54731. Available from https://proceedings.mlr.press/v267/shen25p.html.