Regularizing Neural Networks via Stochastic Branch Layers

Wonpyo Park, Paul Hongsuck Seo, Bohyung Han, Minsu Cho
Proceedings of The Eleventh Asian Conference on Machine Learning, PMLR 101:678-693, 2019.

Abstract

We introduce a novel stochastic regularization technique for deep neural networks, which decomposes a layer into multiple branches with different parameters and merges stochastically sampled combinations of the outputs from the branches during training. Since the factorized branches can collapse into a single branch through a linear operation, inference requires no additional complexity compared to the ordinary layers. The proposed regularization method, referred to as StochasticBranch, is applicable to any linear layers such as fully-connected or convolution layers. The proposed regularizer allows the model to explore diverse regions of the model parameter space via multiple combinations of branches to find better local minima. An extensive set of experiments shows that our method effectively regularizes networks and further improves the generalization performance when used together with other existing regularization techniques.
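The mechanism described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the class name, the per-branch Bernoulli gating, and the `1/(K*p)` normalization are illustrative assumptions chosen so that the expected training output matches the collapsed inference layer.

```python
import numpy as np

class StochasticBranchLinear:
    """Illustrative sketch of a StochasticBranch-style linear layer
    (assumed details, not the paper's code): the layer is factorized
    into K branches with separate weights; during training a random
    subset of branch outputs is merged, and at inference the branches
    collapse into one weight matrix, so inference cost is unchanged."""

    def __init__(self, in_dim, out_dim, num_branches=4, keep_prob=0.5, seed=0):
        self.rng = np.random.default_rng(seed)
        # One weight matrix per branch; their mean plays the role of the
        # ordinary layer's single weight matrix.
        self.W = self.rng.normal(scale=0.1, size=(num_branches, out_dim, in_dim))
        self.b = np.zeros(out_dim)
        self.p = keep_prob

    def forward_train(self, x):
        # Sample a Bernoulli gate per branch and merge the gated outputs.
        # Scaling by 1/(K * p) keeps the expected output equal to the
        # collapsed layer's output (assumed normalization).
        K = self.W.shape[0]
        gates = self.rng.binomial(1, self.p, size=K)
        out = sum(g * (x @ w.T) for g, w in zip(gates, self.W))
        return out / (K * self.p) + self.b

    def forward_eval(self, x):
        # The factorized branches collapse into a single weight matrix
        # via a linear operation (here, the mean), so inference requires
        # no additional complexity compared to an ordinary linear layer.
        W_eff = self.W.mean(axis=0)
        return x @ W_eff.T + self.b
```

In this sketch, averaging many stochastic training passes converges to the deterministic collapsed output, while each individual pass sees a different random combination of branches, which is the source of the regularization effect.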

Cite this Paper


BibTeX
@InProceedings{pmlr-v101-park19a,
  title     = {Regularizing Neural Networks via Stochastic Branch Layers},
  author    = {Park, Wonpyo and Seo, Paul Hongsuck and Han, Bohyung and Cho, Minsu},
  booktitle = {Proceedings of The Eleventh Asian Conference on Machine Learning},
  pages     = {678--693},
  year      = {2019},
  editor    = {Lee, Wee Sun and Suzuki, Taiji},
  volume    = {101},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v101/park19a/park19a.pdf},
  url       = {https://proceedings.mlr.press/v101/park19a.html},
  abstract  = {We introduce a novel stochastic regularization technique for deep neural networks, which decomposes a layer into multiple branches with different parameters and merges stochastically sampled combinations of the outputs from the branches during training. Since the factorized branches can collapse into a single branch through a linear operation, inference requires no additional complexity compared to the ordinary layers. The proposed regularization method, referred to as StochasticBranch, is applicable to any linear layers such as fully-connected or convolution layers. The proposed regularizer allows the model to explore diverse regions of the model parameter space via multiple combinations of branches to find better local minima. An extensive set of experiments shows that our method effectively regularizes networks and further improves the generalization performance when used together with other existing regularization techniques.}
}
Endnote
%0 Conference Paper
%T Regularizing Neural Networks via Stochastic Branch Layers
%A Wonpyo Park
%A Paul Hongsuck Seo
%A Bohyung Han
%A Minsu Cho
%B Proceedings of The Eleventh Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Wee Sun Lee
%E Taiji Suzuki
%F pmlr-v101-park19a
%I PMLR
%P 678--693
%U https://proceedings.mlr.press/v101/park19a.html
%V 101
%X We introduce a novel stochastic regularization technique for deep neural networks, which decomposes a layer into multiple branches with different parameters and merges stochastically sampled combinations of the outputs from the branches during training. Since the factorized branches can collapse into a single branch through a linear operation, inference requires no additional complexity compared to the ordinary layers. The proposed regularization method, referred to as StochasticBranch, is applicable to any linear layers such as fully-connected or convolution layers. The proposed regularizer allows the model to explore diverse regions of the model parameter space via multiple combinations of branches to find better local minima. An extensive set of experiments shows that our method effectively regularizes networks and further improves the generalization performance when used together with other existing regularization techniques.
APA
Park, W., Seo, P.H., Han, B. & Cho, M. (2019). Regularizing Neural Networks via Stochastic Branch Layers. Proceedings of The Eleventh Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 101:678-693. Available from https://proceedings.mlr.press/v101/park19a.html.