Learning Deep ResNet Blocks Sequentially using Boosting Theory

Furong Huang, Jordan Ash, John Langford, Robert Schapire
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2058-2067, 2018.

Abstract

We prove a multi-channel telescoping sum boosting theory for ResNet architectures that simultaneously yields a new technique for boosting over features (in contrast with boosting over labels) and a new training algorithm for ResNet-style architectures. Our proposed training algorithm, BoostResNet, is particularly suitable for non-differentiable architectures: it requires only the relatively inexpensive sequential training of $T$ “shallow ResNets”. We prove that the training error decays exponentially with the depth $T$ if the weak module classifiers that we train perform slightly better than some weak baseline. In other words, we propose a weak learning condition and prove a boosting theory for ResNet under that condition. A generalization error bound based on margin theory is also proved, suggesting that ResNet could be resistant to overfitting when using a network with $\ell_1$-norm-bounded weights.
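The sequential training scheme described above can be illustrated with a minimal sketch. This is not the paper's exact BoostResNet procedure (which trains weak module classifiers under a telescoping-sum boosting framework); it is a hypothetical NumPy toy that captures the key structural idea: each residual block $g_t(x) = x + f_t(x)$ is trained greedily on the frozen output of the previous blocks, together with a linear probe, rather than end-to-end. The trainer `fit_residual_block` and all its parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_residual_block(X, y, n_hidden=16, lr=0.1, steps=200):
    """Fit one residual module f_t plus a linear probe w_t by gradient
    descent on the logistic loss (a hypothetical weak-learning oracle,
    not the paper's exact training procedure)."""
    d = X.shape[1]
    W1 = rng.normal(scale=0.1, size=(d, n_hidden))
    W2 = rng.normal(scale=0.1, size=(n_hidden, d))
    w = np.zeros(d)
    for _ in range(steps):
        H = np.tanh(X @ W1)            # module nonlinearity
        F = H @ W2                     # residual f_t(X)
        Z = X + F                      # ResNet block output g_t(X)
        p = 1 / (1 + np.exp(-Z @ w))   # linear probe prediction
        g_logit = (p - y) / len(y)     # d(logistic loss)/d(logit)
        gw = Z.T @ g_logit             # probe gradient
        gZ = np.outer(g_logit, w)      # backprop into the module
        gW2 = H.T @ gZ
        gW1 = X.T @ ((gZ @ W2.T) * (1 - H**2))
        w -= lr * gw
        W2 -= lr * gW2
        W1 -= lr * gW1
    block = lambda X, W1=W1, W2=W2: X + np.tanh(X @ W1) @ W2
    return block, w

# Toy binary classification data (XOR-like, not linearly separable).
X = rng.normal(size=(200, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# Sequential "boosting-style" training: each block is fit on the
# frozen representation produced by the previous blocks.
T = 3
Z, blocks = X.copy(), []
for t in range(T):
    block, w = fit_residual_block(Z, y)
    blocks.append(block)
    Z = block(Z)  # freeze the block and feed its output forward

acc = np.mean(((Z @ w) > 0) == y)
```

Because each block is trained in isolation and then frozen, the memory and compute cost per stage is that of a single shallow ResNet, which is what makes the sequential scheme comparatively inexpensive and applicable even when the modules are not differentiable end-to-end.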

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-huang18b,
  title     = {Learning Deep {R}es{N}et Blocks Sequentially using Boosting Theory},
  author    = {Huang, Furong and Ash, Jordan and Langford, John and Schapire, Robert},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {2058--2067},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/huang18b/huang18b.pdf},
  url       = {http://proceedings.mlr.press/v80/huang18b.html},
  abstract  = {We prove a multi-channel telescoping sum boosting theory for the ResNet architectures which simultaneously creates a new technique for boosting over features (in contrast with labels) and provides a new algorithm for ResNet-style architectures. Our proposed training algorithm, BoostResNet, is particularly suitable in non-differentiable architectures. Our method only requires the relatively inexpensive sequential training of $T$ “shallow ResNets”. We prove that the training error decays exponentially with the depth $T$ if the weak module classifiers that we train perform slightly better than some weak baseline. In other words, we propose a weak learning condition and prove a boosting theory for ResNet under the weak learning condition. A generalization error bound based on margin theory is proved and suggests that ResNet could be resistant to overfitting using a network with $l_1$ norm bounded weights.}
}
Endnote
%0 Conference Paper
%T Learning Deep ResNet Blocks Sequentially using Boosting Theory
%A Furong Huang
%A Jordan Ash
%A John Langford
%A Robert Schapire
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-huang18b
%I PMLR
%P 2058--2067
%U http://proceedings.mlr.press/v80/huang18b.html
%V 80
%X We prove a multi-channel telescoping sum boosting theory for the ResNet architectures which simultaneously creates a new technique for boosting over features (in contrast with labels) and provides a new algorithm for ResNet-style architectures. Our proposed training algorithm, BoostResNet, is particularly suitable in non-differentiable architectures. Our method only requires the relatively inexpensive sequential training of $T$ “shallow ResNets”. We prove that the training error decays exponentially with the depth $T$ if the weak module classifiers that we train perform slightly better than some weak baseline. In other words, we propose a weak learning condition and prove a boosting theory for ResNet under the weak learning condition. A generalization error bound based on margin theory is proved and suggests that ResNet could be resistant to overfitting using a network with $l_1$ norm bounded weights.
APA
Huang, F., Ash, J., Langford, J. &amp; Schapire, R. (2018). Learning Deep ResNet Blocks Sequentially using Boosting Theory. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2058-2067. Available from http://proceedings.mlr.press/v80/huang18b.html.
