GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation

Marc Brockschmidt
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1144-1152, 2020.

Abstract

This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM). Many standard GNN variants propagate information along the edges of a graph by computing messages based only on the representation of the source of each edge. In GNN-FiLM, the representation of the target node of an edge is used to compute a transformation that can be applied to all incoming messages, allowing feature-wise modulation of the passed information. Different GNN architectures are compared in extensive experiments on three tasks from the literature, using re-implementations of many baseline methods. Hyperparameters for all methods were found using extensive search, yielding somewhat surprising results: differences between state-of-the-art models are much smaller than reported in the literature, and well-known simple baselines that are often omitted from comparisons perform better than recently proposed GNN variants. Nonetheless, GNN-FiLM outperforms these methods on a regression task on molecular graphs and performs competitively on other tasks.
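The mechanism described above can be illustrated with a minimal NumPy sketch of one GNN-FiLM propagation step, assuming a single edge type and a ReLU nonlinearity for simplicity (the function and weight names below are illustrative, not from the paper's code). For each edge, the target node's representation produces a feature-wise scale and shift that modulate the linear message computed from the source node, before messages are summed at the target:

```python
import numpy as np

def gnn_film_layer(h, edges, W, W_gamma, W_beta):
    """One GNN-FiLM propagation step (hypothetical single-edge-type sketch).

    h:      (num_nodes, d) node representations
    edges:  list of (source, target) index pairs
    For each edge (s, t), the target representation h[t] yields a
    feature-wise scale gamma and shift beta (FiLM) that modulate the
    message W @ h[s] before aggregation at the target node.
    """
    out = np.zeros_like(h)
    for s, t in edges:
        gamma = W_gamma @ h[t]        # per-feature scale from the target node
        beta = W_beta @ h[t]          # per-feature shift from the target node
        msg = W @ h[s]                # standard linear message from the source
        out[t] += gamma * msg + beta  # feature-wise linear modulation
    return np.maximum(out, 0.0)       # ReLU nonlinearity

rng = np.random.default_rng(0)
d = 4
h = rng.normal(size=(3, d))
edges = [(0, 1), (2, 1), (1, 0)]
W, W_gamma, W_beta = (rng.normal(size=(d, d)) for _ in range(3))
h_next = gnn_film_layer(h, edges, W, W_gamma, W_beta)
```

Note how, unlike source-only message functions, the modulation parameters depend on the *target* of each edge, so the same source message can be amplified or suppressed per feature depending on which node receives it.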

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-brockschmidt20a,
  title     = {{GNN}-{F}i{LM}: Graph Neural Networks with Feature-wise Linear Modulation},
  author    = {Brockschmidt, Marc},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1144--1152},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/brockschmidt20a/brockschmidt20a.pdf},
  url       = {https://proceedings.mlr.press/v119/brockschmidt20a.html},
  abstract  = {This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM). Many standard GNN variants propagate information along the edges of a graph by computing messages based only on the representation of the source of each edge. In GNN-FiLM, the representation of the target node of an edge is used to compute a transformation that can be applied to all incoming messages, allowing feature-wise modulation of the passed information. Different GNN architectures are compared in extensive experiments on three tasks from the literature, using re-implementations of many baseline methods. Hyperparameters for all methods were found using extensive search, yielding somewhat surprising results: differences between state of the art models are much smaller than reported in the literature and well-known simple baselines that are often not compared to perform better than recently proposed GNN variants. Nonetheless, GNN-FiLM outperforms these methods on a regression task on molecular graphs and performs competitively on other tasks.}
}
Endnote
%0 Conference Paper %T GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation %A Marc Brockschmidt %B Proceedings of the 37th International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2020 %E Hal Daumé III %E Aarti Singh %F pmlr-v119-brockschmidt20a %I PMLR %P 1144--1152 %U https://proceedings.mlr.press/v119/brockschmidt20a.html %V 119 %X This paper presents a new Graph Neural Network (GNN) type using feature-wise linear modulation (FiLM). Many standard GNN variants propagate information along the edges of a graph by computing messages based only on the representation of the source of each edge. In GNN-FiLM, the representation of the target node of an edge is used to compute a transformation that can be applied to all incoming messages, allowing feature-wise modulation of the passed information. Different GNN architectures are compared in extensive experiments on three tasks from the literature, using re-implementations of many baseline methods. Hyperparameters for all methods were found using extensive search, yielding somewhat surprising results: differences between state of the art models are much smaller than reported in the literature and well-known simple baselines that are often not compared to perform better than recently proposed GNN variants. Nonetheless, GNN-FiLM outperforms these methods on a regression task on molecular graphs and performs competitively on other tasks.
APA
Brockschmidt, M. (2020). GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1144-1152. Available from https://proceedings.mlr.press/v119/brockschmidt20a.html.