Locally Smoothed Neural Networks

Liang Pang, Yanyan Lan, Jun Xu, Jiafeng Guo, Xueqi Cheng
Proceedings of the Ninth Asian Conference on Machine Learning, PMLR 77:177-191, 2017.

Abstract

Convolutional Neural Networks (CNNs) and the locally connected layer are limited in capturing the importance and relations of different local receptive fields, which are often crucial for tasks such as face verification, visual question answering, and word sequence prediction. To tackle this issue, we propose a novel locally smoothed neural network (LSNN) in this paper. The main idea is to represent the weight matrix of the locally connected layer as the product of a kernel and a smoother, where the kernel is shared over different local receptive fields, and the smoother determines the importance and relations of the different local receptive fields. Specifically, a multivariate Gaussian function is utilized to generate the smoother, modeling the location relations among different local receptive fields. Furthermore, content information can also be leveraged by setting the mean and precision of the Gaussian function according to the content. Experiments on variants of MNIST clearly show the advantages of LSNN over CNNs and the locally connected layer.
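The core construction described above can be sketched in a few lines of NumPy. This is a minimal illustrative reading of the abstract, not the authors' implementation: all function names, shapes, and the rank-1 form of the per-field weights are assumptions introduced here for clarity.

```python
import numpy as np

def gaussian_smoother(positions, mean, precision):
    """Unnormalized multivariate Gaussian weight for each receptive-field centre.

    positions: (num_fields, 2) grid coordinates of the local receptive fields.
    mean, precision: parameters of the Gaussian; per the abstract, these could
    also be predicted from the content (not shown in this sketch).
    """
    diff = positions - mean                          # (num_fields, 2)
    # quadratic form diff^T P diff, evaluated per field
    quad = np.einsum('nd,de,ne->n', diff, precision, diff)
    return np.exp(-0.5 * quad)                       # (num_fields,)

def lsnn_layer(patches, kernel, positions, mean, precision):
    """Hypothetical locally smoothed layer: per-field weight = smoother * shared kernel.

    patches: (num_fields, patch_dim) flattened local receptive fields.
    kernel:  (patch_dim,) weights shared across all fields, as in a CNN.
    """
    g = gaussian_smoother(positions, mean, precision)  # importance of each field
    responses = patches @ kernel                       # shared-kernel responses
    return g * responses                               # smoothed per-field outputs

# Toy example: a 3x3 grid of receptive fields over 4-dimensional patches.
positions = np.array([[i, j] for i in range(3) for j in range(3)], dtype=float)
patches = np.ones((9, 4))
kernel = np.full(4, 0.25)
out = lsnn_layer(patches, kernel, positions,
                 mean=np.array([1.0, 1.0]),   # smoother focused on the grid centre
                 precision=np.eye(2))
# the centre field (position [1, 1]) receives the largest weight
```

Unlike a CNN (smoother identically 1 everywhere) or a plain locally connected layer (independent weights per field), the smoother here ties the fields together through their locations while the kernel stays shared.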

Cite this Paper


BibTeX
@InProceedings{pmlr-v77-pang17a,
  title     = {Locally Smoothed Neural Networks},
  author    = {Pang, Liang and Lan, Yanyan and Xu, Jun and Guo, Jiafeng and Cheng, Xueqi},
  booktitle = {Proceedings of the Ninth Asian Conference on Machine Learning},
  pages     = {177--191},
  year      = {2017},
  editor    = {Zhang, Min-Ling and Noh, Yung-Kyun},
  volume    = {77},
  series    = {Proceedings of Machine Learning Research},
  address   = {Yonsei University, Seoul, Republic of Korea},
  month     = {15--17 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v77/pang17a/pang17a.pdf},
  url       = {https://proceedings.mlr.press/v77/pang17a.html},
  abstract  = {Convolutional Neural Networks (CNN) and the locally connected layer are limited in capturing the importance and relations of different local receptive fields, which are often crucial for tasks such as face verification, visual question answering, and word sequence prediction. To tackle the issue, we propose a novel locally smoothed neural network (LSNN) in this paper. The main idea is to represent the weight matrix of the locally connected layer as the product of the kernel and the smoother, where the kernel is shared over different local receptive fields, and the smoother is for determining the importance and relations of different local receptive fields. Specifically, a multi-variate Gaussian function is utilized to generate the smoother, for modeling the location relations among different local receptive fields. Furthermore, the content information can also be leveraged by setting the mean and precision of the Gaussian function according to the content. Experiments on some variant of MNIST clearly show our advantages over CNN and locally connected layer.}
}
Endnote
%0 Conference Paper
%T Locally Smoothed Neural Networks
%A Liang Pang
%A Yanyan Lan
%A Jun Xu
%A Jiafeng Guo
%A Xueqi Cheng
%B Proceedings of the Ninth Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Min-Ling Zhang
%E Yung-Kyun Noh
%F pmlr-v77-pang17a
%I PMLR
%P 177--191
%U https://proceedings.mlr.press/v77/pang17a.html
%V 77
%X Convolutional Neural Networks (CNN) and the locally connected layer are limited in capturing the importance and relations of different local receptive fields, which are often crucial for tasks such as face verification, visual question answering, and word sequence prediction. To tackle the issue, we propose a novel locally smoothed neural network (LSNN) in this paper. The main idea is to represent the weight matrix of the locally connected layer as the product of the kernel and the smoother, where the kernel is shared over different local receptive fields, and the smoother is for determining the importance and relations of different local receptive fields. Specifically, a multi-variate Gaussian function is utilized to generate the smoother, for modeling the location relations among different local receptive fields. Furthermore, the content information can also be leveraged by setting the mean and precision of the Gaussian function according to the content. Experiments on some variant of MNIST clearly show our advantages over CNN and locally connected layer.
APA
Pang, L., Lan, Y., Xu, J., Guo, J., & Cheng, X. (2017). Locally Smoothed Neural Networks. Proceedings of the Ninth Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 77:177-191. Available from https://proceedings.mlr.press/v77/pang17a.html.

Related Material