Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability

Max Revay, Ian Manchester
Proceedings of the 2nd Conference on Learning for Dynamics and Control, PMLR 120:393-403, 2020.

Abstract

Stability of recurrent models is closely linked with trainability, generalizability and, in some applications, safety. Methods that train stable recurrent neural networks, however, do so at a significant cost to expressibility. We propose an implicit model structure that allows for a convex parametrization of stable models using contraction analysis of non-linear systems. Using these stability conditions we propose a new approach to model initialization and then provide a number of empirical results comparing the performance of our proposed model set to previous stable RNNs and vanilla RNNs. By carefully controlling stability in the model, we observe a significant increase in the speed of training and model performance.

Cite this Paper


BibTeX
@InProceedings{pmlr-v120-revay20a,
  title     = {Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability},
  author    = {Revay, Max and Manchester, Ian},
  booktitle = {Proceedings of the 2nd Conference on Learning for Dynamics and Control},
  pages     = {393--403},
  year      = {2020},
  editor    = {Bayen, Alexandre M. and Jadbabaie, Ali and Pappas, George and Parrilo, Pablo A. and Recht, Benjamin and Tomlin, Claire and Zeilinger, Melanie},
  volume    = {120},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--11 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v120/revay20a/revay20a.pdf},
  url       = {https://proceedings.mlr.press/v120/revay20a.html},
  abstract  = {Stability of recurrent models is closely linked with trainability, generalizability and, in some applications, safety. Methods that train stable recurrent neural networks, however, do so at a significant cost to expressibility. We propose an implicit model structure that allows for a convex parametrization of stable models using contraction analysis of non-linear systems. Using these stability conditions we propose a new approach to model initialization and then provide a number of empirical results comparing the performance of our proposed model set to previous stable RNNs and vanilla RNNs. By carefully controlling stability in the model, we observe a significant increase in the speed of training and model performance.}
}
Endnote
%0 Conference Paper
%T Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability
%A Max Revay
%A Ian Manchester
%B Proceedings of the 2nd Conference on Learning for Dynamics and Control
%C Proceedings of Machine Learning Research
%D 2020
%E Alexandre M. Bayen
%E Ali Jadbabaie
%E George Pappas
%E Pablo A. Parrilo
%E Benjamin Recht
%E Claire Tomlin
%E Melanie Zeilinger
%F pmlr-v120-revay20a
%I PMLR
%P 393--403
%U https://proceedings.mlr.press/v120/revay20a.html
%V 120
%X Stability of recurrent models is closely linked with trainability, generalizability and, in some applications, safety. Methods that train stable recurrent neural networks, however, do so at a significant cost to expressibility. We propose an implicit model structure that allows for a convex parametrization of stable models using contraction analysis of non-linear systems. Using these stability conditions we propose a new approach to model initialization and then provide a number of empirical results comparing the performance of our proposed model set to previous stable RNNs and vanilla RNNs. By carefully controlling stability in the model, we observe a significant increase in the speed of training and model performance.
APA
Revay, M. & Manchester, I. (2020). Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability. Proceedings of the 2nd Conference on Learning for Dynamics and Control, in Proceedings of Machine Learning Research 120:393-403. Available from https://proceedings.mlr.press/v120/revay20a.html.