Size-Invariant Graph Representations for Graph Classification Extrapolations

Beatrice Bevilacqua, Yangze Zhou, Bruno Ribeiro
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:837-851, 2021.

Abstract

In general, graph representation learning methods assume that the train and test data come from the same distribution. In this work we consider an underexplored area of an otherwise rapidly developing field of graph representation learning: The task of out-of-distribution (OOD) graph classification, where train and test data have different distributions, with test data unavailable during training. Our work shows it is possible to use a causal model to learn approximately invariant representations that better extrapolate between train and test data. Finally, we conclude with synthetic and real-world dataset experiments showcasing the benefits of representations that are invariant to train/test distribution shifts.

Cite this Paper

BibTeX
@InProceedings{pmlr-v139-bevilacqua21a,
  title     = {Size-Invariant Graph Representations for Graph Classification Extrapolations},
  author    = {Bevilacqua, Beatrice and Zhou, Yangze and Ribeiro, Bruno},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {837--851},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/bevilacqua21a/bevilacqua21a.pdf},
  url       = {https://proceedings.mlr.press/v139/bevilacqua21a.html},
  abstract  = {In general, graph representation learning methods assume that the train and test data come from the same distribution. In this work we consider an underexplored area of an otherwise rapidly developing field of graph representation learning: The task of out-of-distribution (OOD) graph classification, where train and test data have different distributions, with test data unavailable during training. Our work shows it is possible to use a causal model to learn approximately invariant representations that better extrapolate between train and test data. Finally, we conclude with synthetic and real-world dataset experiments showcasing the benefits of representations that are invariant to train/test distribution shifts.}
}
Endnote
%0 Conference Paper
%T Size-Invariant Graph Representations for Graph Classification Extrapolations
%A Beatrice Bevilacqua
%A Yangze Zhou
%A Bruno Ribeiro
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-bevilacqua21a
%I PMLR
%P 837--851
%U https://proceedings.mlr.press/v139/bevilacqua21a.html
%V 139
%X In general, graph representation learning methods assume that the train and test data come from the same distribution. In this work we consider an underexplored area of an otherwise rapidly developing field of graph representation learning: The task of out-of-distribution (OOD) graph classification, where train and test data have different distributions, with test data unavailable during training. Our work shows it is possible to use a causal model to learn approximately invariant representations that better extrapolate between train and test data. Finally, we conclude with synthetic and real-world dataset experiments showcasing the benefits of representations that are invariant to train/test distribution shifts.
APA
Bevilacqua, B., Zhou, Y. & Ribeiro, B. (2021). Size-Invariant Graph Representations for Graph Classification Extrapolations. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:837-851. Available from https://proceedings.mlr.press/v139/bevilacqua21a.html.