Learning the Irreducible Representations of Commutative Lie Groups

Taco Cohen, Max Welling
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1755-1763, 2014.

Abstract

We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
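The abstract notes that compact commutative Lie groups include the groups of cyclic image translation. A standard fact behind this setting is that the irreducible representations of a commutative group are one-dimensional characters, so the cyclic-shift operator becomes diagonal in the Fourier basis. The following NumPy sketch illustrates that fact only; it is not the paper's model:

```python
import numpy as np

# The cyclic group Z_N acts on length-N signals by the shift matrix S.
# Its irreducible representations are the 1-D characters
# chi_k(shift) = exp(-2*pi*i*k/N), so the unitary DFT diagonalizes S.
N = 8
S = np.roll(np.eye(N), 1, axis=0)       # S e_j = e_{j+1}: shift by one sample
F = np.fft.fft(np.eye(N)) / np.sqrt(N)  # unitary DFT matrix
D = F @ S @ np.conj(F).T                # diagonal in the Fourier basis

off_diag = D - np.diag(np.diag(D))
print(np.allclose(off_diag, 0))         # True: S is diagonalized
eigs = np.diag(D)
print(np.allclose(eigs, np.exp(-2j * np.pi * np.arange(N) / N)))  # True
```

Each diagonal entry exp(-2πik/N) is one irreducible representation of the cyclic group evaluated at the unit shift; this is the sense in which a transformation "disentangles" into independent one-dimensional factors.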

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-cohen14,
  title = {Learning the Irreducible Representations of Commutative Lie Groups},
  author = {Taco Cohen and Max Welling},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages = {1755--1763},
  year = {2014},
  editor = {Eric P. Xing and Tony Jebara},
  volume = {32},
  number = {2},
  series = {Proceedings of Machine Learning Research},
  address = {Beijing, China},
  month = {22--24 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v32/cohen14.pdf},
  url = {http://proceedings.mlr.press/v32/cohen14.html},
  abstract = {We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.}
}
Endnote
%0 Conference Paper
%T Learning the Irreducible Representations of Commutative Lie Groups
%A Taco Cohen
%A Max Welling
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-cohen14
%I PMLR
%J Proceedings of Machine Learning Research
%P 1755--1763
%U http://proceedings.mlr.press
%V 32
%N 2
%W PMLR
%X We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
RIS
TY - CPAPER
TI - Learning the Irreducible Representations of Commutative Lie Groups
AU - Taco Cohen
AU - Max Welling
BT - Proceedings of the 31st International Conference on Machine Learning
PY - 2014/01/27
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-cohen14
PB - PMLR
SP - 1755
EP - 1763
DP - PMLR
L1 - http://proceedings.mlr.press/v32/cohen14.pdf
UR - http://proceedings.mlr.press/v32/cohen14.html
AB - We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data. To define the notion of disentangling, we borrow a fundamental principle from physics that is used to derive the elementary particles of a system from its symmetries. Our model employs a newfound Bayesian conjugacy relation that enables fully tractable probabilistic inference over compact commutative Lie groups – a class that includes the groups that describe the rotation and cyclic translation of images. We train the model on pairs of transformed image patches, and show that the learned invariant representation is highly effective for classification.
ER -
APA
Cohen, T. & Welling, M. (2014). Learning the Irreducible Representations of Commutative Lie Groups. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(2):1755-1763.