Symplectic Momentum Neural Networks - Using Discrete Variational Mechanics as a prior in Deep Learning

Saul Santos, Monica Ekal, Rodrigo Ventura
Proceedings of The 4th Annual Learning for Dynamics and Control Conference, PMLR 168:584-595, 2022.

Abstract

With deep learning gaining increasing attention from the research community for the prediction and control of real physical systems, learning physically meaningful representations is now more important than ever. It is crucial that these representations be consistent with physics. When learning from discrete data, this can be guaranteed by including some form of prior in the learning process; however, not all discretization priors preserve important physical structures. In this paper we introduce Symplectic Momentum Neural Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems. This formulation constrains SyMos to preserve important geometric structures, such as momentum and the symplectic form, and to learn from limited data. Furthermore, it allows learning dynamics using only poses as training data. We extend SyMos to include variational integrators within the learning framework by developing an implicit root-finding layer, which leads to End-to-End Symplectic Momentum Neural Networks (E2E-SyMo). Through experimental results on the pendulum and cartpole, we show that this combination not only allows these models to learn from limited data but also enables them to preserve the symplectic form and exhibit better long-term behaviour.
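The discrete variational mechanics the abstract refers to can be illustrated with a small, self-contained sketch (a pendulum example with assumed parameters, not the paper's implementation). A discrete Lagrangian L_d(q_k, q_{k+1}) approximates the action over one time step, and the discrete Euler-Lagrange equation D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0 is implicit in q_{k+1}, so each step requires a root-finding solve — the role played by the implicit root-finding layer in E2E-SyMo:

```python
import math

# Hypothetical illustration (not the authors' code): a midpoint-rule
# variational integrator for a pendulum. The discrete Lagrangian
#   L_d(q_k, q_{k+1}) = h * L((q_k + q_{k+1})/2, (q_{k+1} - q_k)/h)
# yields the discrete Euler-Lagrange equation
#   D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0,
# which is implicit in q_{k+1}; we solve it with a few Newton steps.

M, L, G = 1.0, 1.0, 9.81  # assumed mass, length, gravity

def d1_Ld(a, b, h):
    """Derivative of L_d with respect to its first argument."""
    return -M * L**2 * (b - a) / h - 0.5 * h * M * G * L * math.sin(0.5 * (a + b))

def d2_Ld(a, b, h):
    """Derivative of L_d with respect to its second argument."""
    return M * L**2 * (b - a) / h - 0.5 * h * M * G * L * math.sin(0.5 * (a + b))

def step(q_prev, q_cur, h, newton_iters=8):
    """Solve D2 L_d(q_prev, q_cur) + D1 L_d(q_cur, q_next) = 0 for q_next."""
    q_next = 2 * q_cur - q_prev  # initial guess: linear extrapolation
    for _ in range(newton_iters):
        f = d2_Ld(q_prev, q_cur, h) + d1_Ld(q_cur, q_next, h)
        df = -M * L**2 / h - 0.25 * h * M * G * L * math.cos(0.5 * (q_cur + q_next))
        q_next -= f / df
    return q_next

# Release the pendulum from rest at q = 0.5 rad and integrate for 10 s.
h, traj = 0.01, [0.5, 0.5]
for _ in range(1000):
    traj.append(step(traj[-2], traj[-1], h))
print(max(abs(q) for q in traj))
```

Because the scheme is symplectic and momentum-preserving, the oscillation amplitude stays bounded near 0.5 rad over long horizons instead of drifting, which is the long-term behaviour the paper's models are designed to inherit.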

Cite this Paper


BibTeX
@InProceedings{pmlr-v168-santos22a,
  title     = {Symplectic Momentum Neural Networks - Using Discrete Variational Mechanics as a prior in Deep Learning},
  author    = {Santos, Saul and Ekal, Monica and Ventura, Rodrigo},
  booktitle = {Proceedings of The 4th Annual Learning for Dynamics and Control Conference},
  pages     = {584--595},
  year      = {2022},
  editor    = {Firoozi, Roya and Mehr, Negar and Yel, Esen and Antonova, Rika and Bohg, Jeannette and Schwager, Mac and Kochenderfer, Mykel},
  volume    = {168},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--24 Jun},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v168/santos22a/santos22a.pdf},
  url       = {https://proceedings.mlr.press/v168/santos22a.html},
  abstract  = {With deep learning gaining increasing attention from the research community for the prediction and control of real physical systems, learning physically meaningful representations is now more important than ever. It is crucial that these representations be consistent with physics. When learning from discrete data, this can be guaranteed by including some form of prior in the learning process; however, not all discretization priors preserve important physical structures. In this paper we introduce Symplectic Momentum Neural Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems. This formulation constrains SyMos to preserve important geometric structures, such as momentum and the symplectic form, and to learn from limited data. Furthermore, it allows learning dynamics using only poses as training data. We extend SyMos to include variational integrators within the learning framework by developing an implicit root-finding layer, which leads to End-to-End Symplectic Momentum Neural Networks (E2E-SyMo). Through experimental results on the pendulum and cartpole, we show that this combination not only allows these models to learn from limited data but also enables them to preserve the symplectic form and exhibit better long-term behaviour.}
}
Endnote
%0 Conference Paper
%T Symplectic Momentum Neural Networks - Using Discrete Variational Mechanics as a prior in Deep Learning
%A Saul Santos
%A Monica Ekal
%A Rodrigo Ventura
%B Proceedings of The 4th Annual Learning for Dynamics and Control Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Roya Firoozi
%E Negar Mehr
%E Esen Yel
%E Rika Antonova
%E Jeannette Bohg
%E Mac Schwager
%E Mykel Kochenderfer
%F pmlr-v168-santos22a
%I PMLR
%P 584--595
%U https://proceedings.mlr.press/v168/santos22a.html
%V 168
%X With deep learning gaining increasing attention from the research community for the prediction and control of real physical systems, learning physically meaningful representations is now more important than ever. It is crucial that these representations be consistent with physics. When learning from discrete data, this can be guaranteed by including some form of prior in the learning process; however, not all discretization priors preserve important physical structures. In this paper we introduce Symplectic Momentum Neural Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems. This formulation constrains SyMos to preserve important geometric structures, such as momentum and the symplectic form, and to learn from limited data. Furthermore, it allows learning dynamics using only poses as training data. We extend SyMos to include variational integrators within the learning framework by developing an implicit root-finding layer, which leads to End-to-End Symplectic Momentum Neural Networks (E2E-SyMo). Through experimental results on the pendulum and cartpole, we show that this combination not only allows these models to learn from limited data but also enables them to preserve the symplectic form and exhibit better long-term behaviour.
APA
Santos, S., Ekal, M. & Ventura, R. (2022). Symplectic Momentum Neural Networks - Using Discrete Variational Mechanics as a prior in Deep Learning. Proceedings of The 4th Annual Learning for Dynamics and Control Conference, in Proceedings of Machine Learning Research 168:584-595. Available from https://proceedings.mlr.press/v168/santos22a.html.