Universal Simulation of Stable Dynamical Systems by Recurrent Neural Nets

Joshua Hanson, Maxim Raginsky
Proceedings of the 2nd Conference on Learning for Dynamics and Control, PMLR 120:384-392, 2020.

Abstract

It is well-known that continuous-time recurrent neural nets are universal approximators for continuous-time dynamical systems. However, existing results provide approximation guarantees only for finite-time trajectories. In this work, we show that infinite-time trajectories generated by dynamical systems that are stable in a certain sense can be reproduced arbitrarily accurately by recurrent neural nets. For a subclass of these stable systems, we provide quantitative estimates on the sufficient number of neurons needed to achieve a specified error tolerance.
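The paper's setting can be illustrated with a toy sketch (not the authors' construction): a stable scalar system $\dot{x} = -x$ is mimicked by a one-neuron continuous-time RNN $\dot{y} = N\tanh(-y/N)$, whose vector field approximates $-x$ near the origin. Because both systems are stable, the simulation error stays small over an arbitrarily long horizon rather than accumulating. The gain `N` and the forward-Euler integrator are illustrative choices.

```python
import numpy as np

def simulate(f, x0, dt=1e-3, T=20.0):
    """Integrate the scalar ODE x' = f(x) by forward Euler from x0."""
    xs = [x0]
    for _ in range(int(T / dt)):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return np.array(xs)

N = 10.0  # neuron "gain"; larger N tightens the tanh approximation of -y

# Target stable system and its one-neuron continuous-time RNN surrogate
target = simulate(lambda x: -x, x0=1.0)
rnn = simulate(lambda y: N * np.tanh(-y / N), x0=1.0)

err = np.max(np.abs(target - rnn))
print(f"max trajectory error over [0, 20]: {err:.5f}")
```

Here stability does the work the paper formalizes: both trajectories contract toward the equilibrium, so the pointwise vector-field error never compounds and the worst-case trajectory gap stays uniformly small in time.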

Cite this Paper


BibTeX
@InProceedings{pmlr-v120-hanson20a,
  title     = {Universal Simulation of Stable Dynamical Systems by Recurrent Neural Nets},
  author    = {Hanson, Joshua and Raginsky, Maxim},
  booktitle = {Proceedings of the 2nd Conference on Learning for Dynamics and Control},
  pages     = {384--392},
  year      = {2020},
  editor    = {Bayen, Alexandre M. and Jadbabaie, Ali and Pappas, George and Parrilo, Pablo A. and Recht, Benjamin and Tomlin, Claire and Zeilinger, Melanie},
  volume    = {120},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--11 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v120/hanson20a/hanson20a.pdf},
  url       = {https://proceedings.mlr.press/v120/hanson20a.html},
  abstract  = {It is well-known that continuous-time recurrent neural nets are universal approximators for continuous-time dynamical systems. However, existing results provide approximation guarantees only for finite-time trajectories. In this work, we show that infinite-time trajectories generated by dynamical systems that are stable in a certain sense can be reproduced arbitrarily accurately by recurrent neural nets. For a subclass of these stable systems, we provide quantitative estimates on the sufficient number of neurons needed to achieve a specified error tolerance.}
}
Endnote
%0 Conference Paper
%T Universal Simulation of Stable Dynamical Systems by Recurrent Neural Nets
%A Joshua Hanson
%A Maxim Raginsky
%B Proceedings of the 2nd Conference on Learning for Dynamics and Control
%C Proceedings of Machine Learning Research
%D 2020
%E Alexandre M. Bayen
%E Ali Jadbabaie
%E George Pappas
%E Pablo A. Parrilo
%E Benjamin Recht
%E Claire Tomlin
%E Melanie Zeilinger
%F pmlr-v120-hanson20a
%I PMLR
%P 384--392
%U https://proceedings.mlr.press/v120/hanson20a.html
%V 120
%X It is well-known that continuous-time recurrent neural nets are universal approximators for continuous-time dynamical systems. However, existing results provide approximation guarantees only for finite-time trajectories. In this work, we show that infinite-time trajectories generated by dynamical systems that are stable in a certain sense can be reproduced arbitrarily accurately by recurrent neural nets. For a subclass of these stable systems, we provide quantitative estimates on the sufficient number of neurons needed to achieve a specified error tolerance.
APA
Hanson, J. & Raginsky, M. (2020). Universal Simulation of Stable Dynamical Systems by Recurrent Neural Nets. Proceedings of the 2nd Conference on Learning for Dynamics and Control, in Proceedings of Machine Learning Research 120:384-392. Available from https://proceedings.mlr.press/v120/hanson20a.html.