Exploiting Cyclic Symmetry in Convolutional Neural Networks

Sander Dieleman, Jeffrey De Fauw, Koray Kavukcuoglu
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1889-1898, 2016.

Abstract

Many classes of images exhibit rotational symmetry. Convolutional neural networks are sometimes trained using data augmentation to exploit this, but they are still required to learn the rotation equivariance properties from the data. Encoding these properties into the network architecture, as we are already used to doing for translation equivariance by using convolutional layers, could result in a more efficient use of the parameter budget by relieving the model from learning them. We introduce four operations which can be inserted into neural network models as layers, and which can be combined to make these models partially equivariant to rotations. They also enable parameter sharing across different orientations. We evaluate the effect of these architectural modifications on three datasets which exhibit rotational symmetry and demonstrate improved performance with smaller models.
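To make the idea concrete, here is a minimal NumPy sketch of two of the operations as they are commonly described for this paper: cyclic slicing, which stacks four 90-degree-rotated copies of a minibatch along the batch axis, and cyclic pooling, which reduces over those copies with a permutation-invariant function. The function names and the choice of mean pooling are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def cyclic_slice(x):
    # x: minibatch of images, shape (N, C, H, W).
    # Returns shape (4N, C, H, W): block k holds the input rotated by k * 90 degrees.
    return np.concatenate([np.rot90(x, k, axes=(2, 3)) for k in range(4)], axis=0)

def cyclic_pool(x, reduce=np.mean):
    # x: activations computed downstream of cyclic_slice, shape (4N, ...).
    # Reduces over the four orientation copies with a permutation-invariant
    # function, so the result is invariant to 90-degree rotations of the input.
    return reduce(x.reshape((4, -1) + x.shape[1:]), axis=0)

Inserting cyclic_slice at the input and cyclic_pool near the output lets all intermediate layers share their parameters across the four orientations; the paper's rolling and stacking operations additionally allow feature maps from different orientations to interact within the network.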

Cite this Paper

BibTeX
@InProceedings{pmlr-v48-dieleman16,
  title     = {Exploiting Cyclic Symmetry in Convolutional Neural Networks},
  author    = {Dieleman, Sander and De Fauw, Jeffrey and Kavukcuoglu, Koray},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1889--1898},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/dieleman16.pdf},
  url       = {https://proceedings.mlr.press/v48/dieleman16.html}
}
APA
Dieleman, S., De Fauw, J. & Kavukcuoglu, K. (2016). Exploiting Cyclic Symmetry in Convolutional Neural Networks. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1889-1898. Available from https://proceedings.mlr.press/v48/dieleman16.html.
