Training CNNs with Selective Allocation of Channels

Jongheon Jeong, Jinwoo Shin
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3080-3090, 2019.

Abstract

Recent progress in deep convolutional neural networks (CNNs) has enabled a simple paradigm of architecture design: larger models typically achieve better accuracy. Consequently, in modern CNN architectures, it becomes more important to design models that generalize well under certain resource constraints, e.g., the number of parameters. In this paper, we propose a simple way to improve the capacity of any CNN model having large-scale features, without adding more parameters. In particular, we modify a standard convolutional layer to have a new functionality of channel-selectivity, so that the layer is trained to select important channels and re-distribute parameters to them. Our experimental results across various CNN architectures and datasets demonstrate that the proposed convolutional layer allows new optima that generalize better via efficient resource utilization, compared to the baseline.
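To make the channel-selectivity idea concrete, the following is a minimal NumPy sketch of the selection step only: a learned per-channel gate scores each channel, and the lowest-scoring channels are masked out, freeing their parameter budget for re-allocation during training. The function name, the top-k selection rule, and the `keep_ratio` parameter are illustrative assumptions, not the paper's actual mechanism.

```python
import numpy as np

def select_channels(feature_map, gates, keep_ratio=0.5):
    """Keep the most important channels, as scored by learned gates.

    feature_map: (C, H, W) activations; gates: (C,) learned importance scores.
    Channels with the smallest |gate| are zeroed out; in the paper's setting
    their parameters would then be re-distributed to the surviving channels
    (not modeled here -- this only illustrates the selection).
    """
    c = feature_map.shape[0]
    k = max(1, int(c * keep_ratio))
    keep = np.argsort(-np.abs(gates))[:k]      # indices of the top-k channels
    mask = np.zeros(c, dtype=feature_map.dtype)
    mask[keep] = 1.0
    return feature_map * mask[:, None, None], keep

# Hypothetical usage: 8 channels, keep the 4 with the largest gate magnitude.
x = np.ones((8, 2, 2))
gates = np.array([0.9, 0.1, 0.5, 0.05, 0.8, 0.2, 0.7, 0.3])
y, kept = select_channels(x, gates, keep_ratio=0.5)
```

In an actual training loop the gates would be learned jointly with the convolution weights (e.g., via a sparsity penalty), so that selection emerges from optimization rather than a fixed threshold.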

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-jeong19c,
  title = {Training {CNN}s with Selective Allocation of Channels},
  author = {Jeong, Jongheon and Shin, Jinwoo},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {3080--3090},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/jeong19c/jeong19c.pdf},
  url = {https://proceedings.mlr.press/v97/jeong19c.html},
  abstract = {Recent progress in deep convolutional neural networks (CNNs) has enabled a simple paradigm of architecture design: larger models typically achieve better accuracy. Consequently, in modern CNN architectures, it becomes more important to design models that generalize well under certain resource constraints, e.g., the number of parameters. In this paper, we propose a simple way to improve the capacity of any CNN model having large-scale features, without adding more parameters. In particular, we modify a standard convolutional layer to have a new functionality of channel-selectivity, so that the layer is trained to select important channels and re-distribute parameters to them. Our experimental results across various CNN architectures and datasets demonstrate that the proposed convolutional layer allows new optima that generalize better via efficient resource utilization, compared to the baseline.}
}
Endnote
%0 Conference Paper
%T Training CNNs with Selective Allocation of Channels
%A Jongheon Jeong
%A Jinwoo Shin
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-jeong19c
%I PMLR
%P 3080--3090
%U https://proceedings.mlr.press/v97/jeong19c.html
%V 97
%X Recent progress in deep convolutional neural networks (CNNs) has enabled a simple paradigm of architecture design: larger models typically achieve better accuracy. Consequently, in modern CNN architectures, it becomes more important to design models that generalize well under certain resource constraints, e.g., the number of parameters. In this paper, we propose a simple way to improve the capacity of any CNN model having large-scale features, without adding more parameters. In particular, we modify a standard convolutional layer to have a new functionality of channel-selectivity, so that the layer is trained to select important channels and re-distribute parameters to them. Our experimental results across various CNN architectures and datasets demonstrate that the proposed convolutional layer allows new optima that generalize better via efficient resource utilization, compared to the baseline.
APA
Jeong, J. & Shin, J. (2019). Training CNNs with Selective Allocation of Channels. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3080-3090. Available from https://proceedings.mlr.press/v97/jeong19c.html.