Differentiable Dynamic Normalization for Learning Deep Representation

Ping Luo, Peng Zhanglin, Shao Wenqi, Zhang Ruimao, Ren Jiamin, Wu Lingyun
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4203-4211, 2019.

Abstract

This work presents Dynamic Normalization (DN), which learns arbitrary normalization operations for different convolutional layers in a deep ConvNet. Unlike existing normalization approaches, which predefine how the statistics (mean and variance) are computed, DN learns to estimate them. DN has several appealing benefits. First, it adapts to various networks, tasks, and batch sizes. Second, it can be easily implemented and trained end-to-end in a differentiable manner with only a small number of parameters. Third, its matrix formulation represents a wide range of normalization methods, shedding light on analyzing them theoretically. Extensive studies show that DN outperforms its counterparts on CIFAR10 and ImageNet.
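The abstract leaves the parameterization implicit, so below is a minimal PyTorch sketch of the core idea: a normalization layer that learns how to estimate its statistics rather than fixing the computation in advance. The class name LearnedStatsNorm and the specific parameterization (a softmax-weighted mixture of instance-, layer-, and batch-wise statistics, in the spirit of the authors' earlier Switchable Normalization) are illustrative assumptions, not the paper's exact matrix formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnedStatsNorm(nn.Module):
    """Illustrative sketch: a normalization layer whose statistics are learned.

    Mean and variance are estimated as softmax-weighted mixtures of
    instance-, layer-, and batch-wise statistics, so each layer can
    interpolate between IN-, LN-, and BN-like behavior. This mirrors the
    abstract's idea of learning the estimator; it is not claimed to be
    DN's exact formulation.
    """

    def __init__(self, num_channels: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Standard affine parameters (gamma, beta), as in BN/IN/LN.
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        # Learnable mixing logits over {instance, layer, batch} statistics:
        # only six extra scalars per layer, trained end-to-end by backprop.
        self.mean_logits = nn.Parameter(torch.zeros(3))
        self.var_logits = nn.Parameter(torch.zeros(3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (N, C, H, W).
        mu_in = x.mean(dim=(2, 3), keepdim=True)     # per sample, per channel
        var_in = x.var(dim=(2, 3), unbiased=False, keepdim=True)
        mu_ln = x.mean(dim=(1, 2, 3), keepdim=True)  # per sample
        var_ln = x.var(dim=(1, 2, 3), unbiased=False, keepdim=True)
        mu_bn = x.mean(dim=(0, 2, 3), keepdim=True)  # per channel, across batch
        var_bn = x.var(dim=(0, 2, 3), unbiased=False, keepdim=True)

        # Learned convex combinations decide how statistics are estimated.
        w_mu = F.softmax(self.mean_logits, dim=0)
        w_var = F.softmax(self.var_logits, dim=0)
        mu = w_mu[0] * mu_in + w_mu[1] * mu_ln + w_mu[2] * mu_bn
        var = w_var[0] * var_in + w_var[1] * var_ln + w_var[2] * var_bn

        x_hat = (x - mu) / torch.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

A layer like this drops in wherever a fixed normalizer would go, e.g. `y = LearnedStatsNorm(64)(torch.randn(8, 64, 32, 32))`, and the mixing weights are learned jointly with the network, consistent with the abstract's claims of end-to-end differentiability and a small parameter overhead.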

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-luo19a,
  title     = {Differentiable Dynamic Normalization for Learning Deep Representation},
  author    = {Luo, Ping and Zhanglin, Peng and Wenqi, Shao and Ruimao, Zhang and Jiamin, Ren and Lingyun, Wu},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4203--4211},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/luo19a/luo19a.pdf},
  url       = {https://proceedings.mlr.press/v97/luo19a.html},
  abstract  = {This work presents Dynamic Normalization (DN), which learns arbitrary normalization operations for different convolutional layers in a deep ConvNet. Unlike existing normalization approaches, which predefine how the statistics (mean and variance) are computed, DN learns to estimate them. DN has several appealing benefits. First, it adapts to various networks, tasks, and batch sizes. Second, it can be easily implemented and trained end-to-end in a differentiable manner with only a small number of parameters. Third, its matrix formulation represents a wide range of normalization methods, shedding light on analyzing them theoretically. Extensive studies show that DN outperforms its counterparts on CIFAR10 and ImageNet.}
}
Endnote
%0 Conference Paper
%T Differentiable Dynamic Normalization for Learning Deep Representation
%A Ping Luo
%A Peng Zhanglin
%A Shao Wenqi
%A Zhang Ruimao
%A Ren Jiamin
%A Wu Lingyun
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-luo19a
%I PMLR
%P 4203--4211
%U https://proceedings.mlr.press/v97/luo19a.html
%V 97
%X This work presents Dynamic Normalization (DN), which learns arbitrary normalization operations for different convolutional layers in a deep ConvNet. Unlike existing normalization approaches, which predefine how the statistics (mean and variance) are computed, DN learns to estimate them. DN has several appealing benefits. First, it adapts to various networks, tasks, and batch sizes. Second, it can be easily implemented and trained end-to-end in a differentiable manner with only a small number of parameters. Third, its matrix formulation represents a wide range of normalization methods, shedding light on analyzing them theoretically. Extensive studies show that DN outperforms its counterparts on CIFAR10 and ImageNet.
APA
Luo, P., Zhanglin, P., Wenqi, S., Ruimao, Z., Jiamin, R. & Lingyun, W. (2019). Differentiable Dynamic Normalization for Learning Deep Representation. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4203-4211. Available from https://proceedings.mlr.press/v97/luo19a.html.
