Learning the Irreducible Representations of Commutative Lie Groups

Taco Cohen, Max Welling
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1755-1763, 2014.

Abstract

We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
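For readers unfamiliar with the terminology, the following is a brief gloss (our sketch, not an excerpt from the paper) of what "irreducible representations" and "disentangling" mean for a one-parameter compact commutative Lie group, such as the group of rotations or cyclic translations of an image. By standard representation theory, any such group acting linearly on R^D can, after a change of basis W, be written as a block-diagonal matrix of 2x2 rotation blocks, each with its own integer frequency \(\omega_j\):

\[
\rho(\theta) \;=\; W
\begin{pmatrix}
R(\omega_1\theta) & & \\
 & \ddots & \\
 & & R(\omega_J\theta)
\end{pmatrix}
W^{-1},
\qquad
R(\omega_j\theta) =
\begin{pmatrix}
\cos\omega_j\theta & -\sin\omega_j\theta \\
\sin\omega_j\theta & \cos\omega_j\theta
\end{pmatrix}.
\]

Each 2x2 block is an irreducible subrepresentation, and the blocks transform independently of one another; this independence is, roughly, the sense in which the learned representation is disentangled. The norm of each two-dimensional block coordinate is unchanged by the group action, so these per-block amplitudes give an invariant description of the input, of the kind the paper evaluates for classification.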

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-cohen14,
  title     = {Learning the Irreducible Representations of Commutative Lie Groups},
  author    = {Cohen, Taco and Welling, Max},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1755--1763},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/cohen14.pdf},
  url       = {https://proceedings.mlr.press/v32/cohen14.html},
  abstract  = {We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.}
}
Endnote
%0 Conference Paper
%T Learning the Irreducible Representations of Commutative Lie Groups
%A Taco Cohen
%A Max Welling
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-cohen14
%I PMLR
%P 1755--1763
%U https://proceedings.mlr.press/v32/cohen14.html
%V 32
%N 2
%X We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
RIS
TY - CPAPER
TI - Learning the Irreducible Representations of Commutative Lie Groups
AU - Taco Cohen
AU - Max Welling
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-cohen14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1755
EP - 1763
L1 - http://proceedings.mlr.press/v32/cohen14.pdf
UR - https://proceedings.mlr.press/v32/cohen14.html
AB - We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
ER -
APA
Cohen, T. & Welling, M. (2014). Learning the Irreducible Representations of Commutative Lie Groups. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1755-1763. Available from https://proceedings.mlr.press/v32/cohen14.html.
