Principled Simplicial Neural Networks for Trajectory Prediction

T. Mitchell Roddenberry, Nicholas Glaze, Santiago Segarra
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:9020-9029, 2021.

Abstract

We consider the construction of neural network architectures for data on simplicial complexes. In studying maps on the chain complex of a simplicial complex, we define three desirable properties of a simplicial neural network architecture: namely, permutation equivariance, orientation equivariance, and simplicial awareness. The first two properties respectively account for the fact that the node indexing and the simplex orientations in a simplicial complex are arbitrary. The last property encodes the desirable feature that the output of the neural network depends on the entire simplicial complex and not on a subset of its dimensions. Based on these properties, we propose a simple convolutional architecture, rooted in tools from algebraic topology, for the problem of trajectory prediction, and show that it obeys all three of these properties when an odd, nonlinear activation function is used. We then demonstrate the effectiveness of this architecture in extrapolating trajectories on synthetic and real datasets, with particular emphasis on the gains in generalizability to unseen trajectories.
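For readers who want a concrete picture of the layer the abstract describes, below is a minimal, hypothetical Python sketch of one simplicial convolutional layer acting on edge flows. It assumes the layer filters the input through the lower Hodge Laplacian B1ᵀB1 and the upper Hodge Laplacian B2B2ᵀ (built from the node-to-edge and edge-to-triangle boundary matrices B1 and B2) and applies an odd activation (tanh), in line with the abstract's emphasis on odd nonlinearities. The function name, the three-term filter form, and the weight names are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def simplicial_conv_layer(flow, B1, B2, W_low, W_id, W_up):
    """Illustrative sketch of one layer on edge flows (not the paper's code).

    flow : (n_edges, d_in) array of edge features (e.g., a trajectory as a flow)
    B1   : (n_nodes, n_edges) node-to-edge boundary matrix
    B2   : (n_edges, n_triangles) edge-to-triangle boundary matrix
    W_*  : (d_in, d_out) weight matrices (hypothetical names)
    """
    L_low = B1.T @ B1   # lower Hodge Laplacian: couples edges sharing a node
    L_up = B2 @ B2.T    # upper Hodge Laplacian: couples edges sharing a triangle
    pre = L_low @ flow @ W_low + flow @ W_id + L_up @ flow @ W_up
    return np.tanh(pre)  # tanh is odd, which underpins the orientation
                         # equivariance discussed in the abstract

# Toy usage: a single filled triangle on nodes {0, 1, 2}, edges oriented
# by increasing node index: (0,1), (0,2), (1,2).
B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)
B2 = np.array([[1.0], [-1.0], [1.0]])  # boundary of the oriented triangle

rng = np.random.default_rng(0)
flow = rng.standard_normal((3, 1))                    # one feature per edge
W_low, W_id, W_up = (rng.standard_normal((1, 4)) for _ in range(3))
out = simplicial_conv_layer(flow, B1, B2, W_low, W_id, W_up)  # shape (3, 4)
```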

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-roddenberry21a,
  title     = {Principled Simplicial Neural Networks for Trajectory Prediction},
  author    = {Roddenberry, T. Mitchell and Glaze, Nicholas and Segarra, Santiago},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {9020--9029},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/roddenberry21a/roddenberry21a.pdf},
  url       = {https://proceedings.mlr.press/v139/roddenberry21a.html}
}
Endnote
%0 Conference Paper
%T Principled Simplicial Neural Networks for Trajectory Prediction
%A T. Mitchell Roddenberry
%A Nicholas Glaze
%A Santiago Segarra
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-roddenberry21a
%I PMLR
%P 9020--9029
%U https://proceedings.mlr.press/v139/roddenberry21a.html
%V 139
APA
Roddenberry, T.M., Glaze, N. & Segarra, S. (2021). Principled Simplicial Neural Networks for Trajectory Prediction. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:9020-9029. Available from https://proceedings.mlr.press/v139/roddenberry21a.html.