Stable ResNet

Soufiane Hayou, Eugenio Clerico, Bobby He, George Deligiannidis, Arnaud Doucet, Judith Rousseau
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1324-1332, 2021.

Abstract

Deep ResNet architectures have achieved state-of-the-art performance on many tasks. While they solve the problem of vanishing gradients, they might suffer from exploding gradients as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet might lose expressivity as the depth goes to infinity (Yang et al. 2017, Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, that have the property of stabilizing the gradient while ensuring expressivity in the infinite depth limit.
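The core idea can be illustrated with a toy sketch. A plain ResNet computes x_{l+1} = x_l + F_l(x_l); the Stable ResNet recipe rescales each residual branch, e.g. uniformly by 1/sqrt(L) for a network of depth L, so that the gradient norm stays bounded as L grows. This is a minimal illustration under that assumption, not the paper's implementation; the function names and the tanh block are made up for the example.

```python
import numpy as np

def stable_resnet_forward(x, weights, lam=None):
    """Toy forward pass through a stack of scaled residual blocks.

    Each block computes x <- x + lam * F(x), where F is a simple
    tanh layer here. With the uniform scaling lam = 1/sqrt(L),
    the sum of squared scaling factors stays bounded in L, which
    is the stabilizing mechanism described in the abstract.
    """
    L = len(weights)
    if lam is None:
        lam = 1.0 / np.sqrt(L)  # uniform Stable ResNet scaling
    for W in weights:
        x = x + lam * np.tanh(W @ x)  # scaled residual branch
    return x

rng = np.random.default_rng(0)
d, L = 16, 200
weights = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(L)]
x0 = rng.standard_normal(d)

scaled = stable_resnet_forward(x0, weights)               # lam = 1/sqrt(L)
unscaled = stable_resnet_forward(x0, weights, lam=1.0)    # plain ResNet
```

At depth 200 the unscaled network's output norm is typically orders of magnitude larger than the scaled one's, which is the instability the scaling is meant to suppress.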

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-hayou21a, title = { Stable ResNet }, author = {Hayou, Soufiane and Clerico, Eugenio and He, Bobby and Deligiannidis, George and Doucet, Arnaud and Rousseau, Judith}, booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics}, pages = {1324--1332}, year = {2021}, editor = {Banerjee, Arindam and Fukumizu, Kenji}, volume = {130}, series = {Proceedings of Machine Learning Research}, month = {13--15 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v130/hayou21a/hayou21a.pdf}, url = {https://proceedings.mlr.press/v130/hayou21a.html}, abstract = { Deep ResNet architectures have achieved state of the art performance on many tasks. While they solve the problem of gradient vanishing, they might suffer from gradient exploding as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet might lose expressivity as the depth goes to infinity (Yang et al. 2017, Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, that have the property of stabilizing the gradient while ensuring expressivity in the infinite depth limit. } }
Endnote
%0 Conference Paper %T Stable ResNet %A Soufiane Hayou %A Eugenio Clerico %A Bobby He %A George Deligiannidis %A Arnaud Doucet %A Judith Rousseau %B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2021 %E Arindam Banerjee %E Kenji Fukumizu %F pmlr-v130-hayou21a %I PMLR %P 1324--1332 %U https://proceedings.mlr.press/v130/hayou21a.html %V 130 %X Deep ResNet architectures have achieved state of the art performance on many tasks. While they solve the problem of gradient vanishing, they might suffer from gradient exploding as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet might lose expressivity as the depth goes to infinity (Yang et al. 2017, Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, that have the property of stabilizing the gradient while ensuring expressivity in the infinite depth limit.
APA
Hayou, S., Clerico, E., He, B., Deligiannidis, G., Doucet, A. & Rousseau, J. (2021). Stable ResNet. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1324-1332. Available from https://proceedings.mlr.press/v130/hayou21a.html.