Graph Element Networks: adaptive, structured computation and memory

Ferran Alet, Adarsh Keshav Jeewajee, Maria Bauza Villalonga, Alberto Rodriguez, Tomas Lozano-Perez, Leslie Kaelbling
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:212-222, 2019.

Abstract

We explore the use of graph neural networks (GNNs) to model spatial processes in which there is no a priori graphical structure. Similar to finite element analysis, we assign nodes of a GNN to spatial locations and use a computational process defined on the graph to model the relationship between an initial function defined over a space and a resulting function in the same space. We use GNNs as a computational substrate, and show that the locations of the nodes in space as well as their connectivity can be optimized to focus on the most complex parts of the space. Moreover, this representational strategy allows the learned input-output relationship to generalize over the size of the underlying space, and allows the same model to be run at different levels of precision, trading computation for accuracy. We demonstrate this method on a traditional PDE problem, a physical prediction problem from robotics, and learning to predict scene images from novel viewpoints.
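
As a rough illustration of the idea in the abstract, the sketch below encodes scattered input samples of a function into latent states at spatially placed graph nodes, runs a few rounds of message passing, and decodes the resulting function at arbitrary query locations. This is a hypothetical, simplified rendering written here for clarity, not the authors' implementation: the class name, the inverse-distance soft assignment of points to nodes, and the fully connected node graph are assumptions made for this example; the paper optimizes node locations and connectivity and uses a general GNN as the computational substrate.

# Minimal sketch of a Graph Element Network-style model (hypothetical code,
# not the authors' implementation). Assumes plain PyTorch.
import torch
import torch.nn as nn


class GraphElementNetworkSketch(nn.Module):
    def __init__(self, n_nodes=16, in_dim=1, out_dim=1, latent_dim=32, steps=3):
        super().__init__()
        # Node positions in the unit square; learnable, so training can move
        # nodes toward the most complex parts of the space.
        self.positions = nn.Parameter(torch.rand(n_nodes, 2))
        self.steps = steps
        self.encoder = nn.Sequential(nn.Linear(in_dim + 2, latent_dim), nn.ReLU(),
                                     nn.Linear(latent_dim, latent_dim))
        self.message = nn.Sequential(nn.Linear(2 * latent_dim, latent_dim), nn.ReLU(),
                                     nn.Linear(latent_dim, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim + 2, latent_dim), nn.ReLU(),
                                     nn.Linear(latent_dim, out_dim))

    def _weights(self, points):
        # Soft assignment of spatial points to nodes by inverse distance.
        d = torch.cdist(points, self.positions)                    # (P, N)
        return torch.softmax(-d, dim=-1)

    def forward(self, in_xy, in_val, query_xy):
        # Encode scattered input samples into node latent states.
        n = self.positions.shape[0]
        w_in = self._weights(in_xy)                                # (P_in, N)
        rel = in_xy.unsqueeze(1) - self.positions                  # (P_in, N, 2)
        feats = torch.cat([in_val.unsqueeze(1).expand(-1, n, -1), rel], dim=-1)
        h = (w_in.unsqueeze(-1) * self.encoder(feats)).sum(dim=0)  # (N, latent)

        # Message passing over a fully connected node graph (a simplification).
        for _ in range(self.steps):
            hi = h.unsqueeze(1).expand(n, n, -1)
            hj = h.unsqueeze(0).expand(n, n, -1)
            h = h + self.message(torch.cat([hi, hj], dim=-1)).mean(dim=1)

        # Decode the resulting function at arbitrary query locations.
        w_q = self._weights(query_xy)                              # (P_q, N)
        rel_q = query_xy.unsqueeze(1) - self.positions             # (P_q, N, 2)
        dec = self.decoder(torch.cat(
            [h.unsqueeze(0).expand(query_xy.shape[0], -1, -1), rel_q], dim=-1))
        return (w_q.unsqueeze(-1) * dec).sum(dim=1)                # (P_q, out_dim)


# Example call (assumed shapes): 50 scattered input samples, 200 query points.
model = GraphElementNetworkSketch()
pred = model(torch.rand(50, 2), torch.rand(50, 1), torch.rand(200, 2))  # (200, 1)

A model like this would be trained by comparing the decoded values at the query points against ground-truth values of the resulting function and backpropagating through both the network weights and the node positions, which is what allows the nodes to migrate toward the most complex parts of the space.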

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-alet19a,
  title     = {Graph Element Networks: adaptive, structured computation and memory},
  author    = {Alet, Ferran and Jeewajee, Adarsh Keshav and Villalonga, Maria Bauza and Rodriguez, Alberto and Lozano-Perez, Tomas and Kaelbling, Leslie},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {212--222},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/alet19a/alet19a.pdf},
  url       = {https://proceedings.mlr.press/v97/alet19a.html}
}
APA
Alet, F., Jeewajee, A.K., Villalonga, M.B., Rodriguez, A., Lozano-Perez, T. & Kaelbling, L. (2019). Graph Element Networks: adaptive, structured computation and memory. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:212-222. Available from https://proceedings.mlr.press/v97/alet19a.html.
