Transformation of ReLU-based recurrent neural networks from discrete-time to continuous-time

Zahra Monfared, Daniel Durstewitz
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6999-7009, 2020.

Abstract

Recurrent neural networks (RNNs) as used in machine learning are commonly formulated in discrete time, i.e. as recursive maps. This brings many advantages for training models on data, e.g. for the purpose of time series prediction or dynamical systems identification, as powerful and efficient inference algorithms exist for discrete-time systems and numerical integration of differential equations is not necessary. On the other hand, mathematical analysis of dynamical systems inferred from data is often more convenient, and enables additional insights, if these systems are formulated in continuous time, i.e. as systems of ordinary or partial differential equations (ODEs/PDEs). Here we show how to perform such a translation from discrete to continuous time for a particular class of ReLU-based RNNs. We prove three theorems on the mathematical equivalence between the discrete- and continuous-time formulations under a variety of conditions, and illustrate how to use our mathematical results on different machine learning and nonlinear dynamical systems examples.
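To make the discrete-to-continuous correspondence concrete, here is a minimal illustrative sketch (not the authors' exact construction, and with hypothetical parameter values). For a piecewise-linear RNN with discrete-time map z_{t+1} = A z_t + W ReLU(z_t) + h, one natural continuous-time counterpart is the ODE dz/dt = (A - I) z + W ReLU(z) + h, since any fixed point of the map is, by construction, an equilibrium of the ODE:

```python
import numpy as np

# Hypothetical parameters for a 2-unit piecewise-linear RNN (for illustration only)
A = np.array([[0.5, 0.0], [0.0, 0.5]])   # diagonal linear part
W = np.array([[0.0, 0.2], [0.2, 0.0]])   # ReLU coupling weights
h = np.array([0.1, -0.1])                # bias

def relu(z):
    return np.maximum(z, 0.0)

def step(z):
    """One discrete-time update: z_{t+1} = A z_t + W ReLU(z_t) + h."""
    return A @ z + W @ relu(z) + h

def vector_field(z):
    """Continuous-time counterpart: dz/dt = (A - I) z + W ReLU(z) + h."""
    return (A - np.eye(2)) @ z + W @ relu(z) + h

# Iterate the map to (numerical) convergence at a fixed point z* = step(z*)
z = np.zeros(2)
for _ in range(200):
    z = step(z)

# The same point is an equilibrium of the ODE: vector_field(z*) = step(z*) - z* ≈ 0
print(np.allclose(step(z), z), np.allclose(vector_field(z), 0.0))  # → True True
```

Note that this fixed-point correspondence is only the simplest facet of the translation; matching the full dynamics (e.g. stability and cycles across the ReLU switching boundaries) requires the conditions established in the paper's theorems.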

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-monfared20a,
  title     = {Transformation of {R}e{LU}-based recurrent neural networks from discrete-time to continuous-time},
  author    = {Monfared, Zahra and Durstewitz, Daniel},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6999--7009},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/monfared20a/monfared20a.pdf},
  url       = {https://proceedings.mlr.press/v119/monfared20a.html},
  abstract  = {Recurrent neural networks (RNN) as used in machine learning are commonly formulated in discrete time, i.e. as recursive maps. This brings a lot of advantages for training models on data, e.g. for the purpose of time series prediction or dynamical systems identification, as powerful and efficient inference algorithms exist for discrete time systems and numerical integration of differential equations is not necessary. On the other hand, mathematical analysis of dynamical systems inferred from data is often more convenient and enables additional insights if these are formulated in continuous time, i.e. as systems of ordinary or partial differential equations (ODE/ PDE). Here we show how to perform such a translation from discrete to continuous time for a particular class of ReLU-based RNN. We prove three theorems on the mathematical equivalence between the discrete and continuous time formulations under a variety of conditions, and illustrate how to use our mathematical results on different machine learning and nonlinear dynamical systems examples.}
}
Endnote
%0 Conference Paper
%T Transformation of ReLU-based recurrent neural networks from discrete-time to continuous-time
%A Zahra Monfared
%A Daniel Durstewitz
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-monfared20a
%I PMLR
%P 6999--7009
%U https://proceedings.mlr.press/v119/monfared20a.html
%V 119
%X Recurrent neural networks (RNN) as used in machine learning are commonly formulated in discrete time, i.e. as recursive maps. This brings a lot of advantages for training models on data, e.g. for the purpose of time series prediction or dynamical systems identification, as powerful and efficient inference algorithms exist for discrete time systems and numerical integration of differential equations is not necessary. On the other hand, mathematical analysis of dynamical systems inferred from data is often more convenient and enables additional insights if these are formulated in continuous time, i.e. as systems of ordinary or partial differential equations (ODE/ PDE). Here we show how to perform such a translation from discrete to continuous time for a particular class of ReLU-based RNN. We prove three theorems on the mathematical equivalence between the discrete and continuous time formulations under a variety of conditions, and illustrate how to use our mathematical results on different machine learning and nonlinear dynamical systems examples.
APA
Monfared, Z. & Durstewitz, D. (2020). Transformation of ReLU-based recurrent neural networks from discrete-time to continuous-time. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6999-7009. Available from https://proceedings.mlr.press/v119/monfared20a.html.