Group Equivariant Convolutional Networks

Taco Cohen, Max Welling
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2990-2999, 2016.

Abstract

We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections and rotations. G-CNNs achieve state-of-the-art results on CIFAR10 and rotated MNIST.
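
To make the weight-sharing claim concrete, below is a minimal PyTorch sketch of a first-layer G-convolution for the group p4 (translations plus 90-degree rotations). It illustrates the idea only and is not the authors' released code; the name p4_lifting_conv and all shapes are assumptions made here. Each filter is applied in four rotated copies, so the output gains an orientation axis at no extra parameter cost.

import torch
import torch.nn.functional as F

def p4_lifting_conv(x, weight):
    """Lift a planar feature map to a p4 feature map (illustrative sketch).

    x:      (batch, in_ch, H, W) input
    weight: (out_ch, in_ch, k, k) filter bank, shared across rotations
    returns (batch, out_ch, 4, H', W'), one slice per rotation
            r in {0, 90, 180, 270} degrees.
    """
    outs = []
    for r in range(4):
        # The same weights are reused in four orientations; this extra
        # weight sharing is what distinguishes a G-convolution from a
        # regular convolution layer.
        w_r = torch.rot90(weight, r, dims=(2, 3))
        outs.append(F.conv2d(x, w_r))
    return torch.stack(outs, dim=2)

# Rotating the input by 90 degrees rotates each output slice and
# cyclically permutes the four orientation slices, which is the
# equivariance property the paper exploits. Deeper G-convolution
# layers additionally sum and shift over this orientation axis.
x = torch.randn(1, 3, 32, 32)
y = p4_lifting_conv(x, torch.randn(16, 3, 3, 3))  # shape (1, 16, 4, 30, 30)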

Cite this Paper

BibTeX
@InProceedings{pmlr-v48-cohenc16,
  title     = {Group Equivariant Convolutional Networks},
  author    = {Cohen, Taco and Welling, Max},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {2990--2999},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/cohenc16.pdf},
  url       = {https://proceedings.mlr.press/v48/cohenc16.html},
  abstract  = {We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections and rotations. G-CNNs achieve state of the art results on CIFAR10 and rotated MNIST.}
}
Endnote
%0 Conference Paper
%T Group Equivariant Convolutional Networks
%A Taco Cohen
%A Max Welling
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-cohenc16
%I PMLR
%P 2990--2999
%U https://proceedings.mlr.press/v48/cohenc16.html
%V 48
%X We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections and rotations. G-CNNs achieve state of the art results on CIFAR10 and rotated MNIST.
RIS
TY  - CPAPER
TI  - Group Equivariant Convolutional Networks
AU  - Taco Cohen
AU  - Max Welling
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-cohenc16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 2990
EP  - 2999
L1  - http://proceedings.mlr.press/v48/cohenc16.pdf
UR  - https://proceedings.mlr.press/v48/cohenc16.html
AB  - We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers. G-convolutions increase the expressive capacity of the network without increasing the number of parameters. Group convolution layers are easy to use and can be implemented with negligible computational overhead for discrete groups generated by translations, reflections and rotations. G-CNNs achieve state of the art results on CIFAR10 and rotated MNIST.
ER  -
APA
Cohen, T. & Welling, M. (2016). Group Equivariant Convolutional Networks. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2990-2999. Available from https://proceedings.mlr.press/v48/cohenc16.html.
