Learning Discrete and Continuous Factors of Data via Alternating Disentanglement

Yeonwoo Jeong, Hyun Oh Song
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3091-3099, 2019.

Abstract

We address the problem of unsupervised disentanglement of discrete and continuous explanatory factors of data. We first show a simple procedure for minimizing the total correlation of the continuous latent variables without having to use a discriminator network or perform importance sampling, via cascading the information flow in the beta-VAE framework. Furthermore, we propose a method which avoids offloading the entire burden of jointly modeling the continuous and discrete factors to the variational encoder by employing a separate discrete inference procedure. This leads to an interesting alternating minimization problem which switches between finding the most likely discrete configuration given the continuous factors and updating the variational encoder based on the computed discrete factors. Experiments show that the proposed method clearly disentangles discrete factors and significantly outperforms current disentanglement methods based on the disentanglement score and inference network classification score. The source code is available at https://github.com/snumllab/DisentanglementICML19.
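The alternating minimization the abstract describes switches between (1) inferring the most likely discrete configuration given the current continuous model and (2) updating the model given those discrete assignments. The following is a minimal NumPy sketch of that two-step structure only, using a toy nearest-codebook assignment and mean update; the data, codebook, and update rule are illustrative assumptions, not the paper's actual CascadeVAE objective or networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated groups, each with continuous variation
# (stand-ins for one discrete factor plus continuous factors).
x = np.concatenate([rng.normal(-2.0, 0.5, (100, 2)),
                    rng.normal(+2.0, 0.5, (100, 2))])

# Hypothetical discrete codebook: one embedding per discrete category.
# Deterministic initialization keeps the sketch reproducible.
codebook = np.array([[-1.0, -1.0],
                     [+1.0, +1.0]])

for step in range(20):
    # Step 1: most likely discrete configuration given the current model
    # (here: nearest codebook entry for each sample).
    dist = ((x[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    assign = dist.argmin(1)

    # Step 2: update the continuous model given the inferred discrete
    # factors (here: move each codebook entry to its members' mean).
    for k in range(len(codebook)):
        members = x[assign == k]
        if len(members):
            codebook[k] = members.mean(0)
```

In the paper's setting, step 1 plays the role of the separate discrete inference procedure and step 2 the variational-encoder update; this sketch replaces both with closed-form surrogates to expose the alternation itself.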

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-jeong19d,
  title     = {Learning Discrete and Continuous Factors of Data via Alternating Disentanglement},
  author    = {Jeong, Yeonwoo and Song, Hyun Oh},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {3091--3099},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/jeong19d/jeong19d.pdf},
  url       = {https://proceedings.mlr.press/v97/jeong19d.html},
  abstract  = {We address the problem of unsupervised disentanglement of discrete and continuous explanatory factors of data. We first show a simple procedure for minimizing the total correlation of the continuous latent variables without having to use a discriminator network or perform importance sampling, via cascading the information flow in the beta-VAE framework. Furthermore, we propose a method which avoids offloading the entire burden of jointly modeling the continuous and discrete factors to the variational encoder by employing a separate discrete inference procedure. This leads to an interesting alternating minimization problem which switches between finding the most likely discrete configuration given the continuous factors and updating the variational encoder based on the computed discrete factors. Experiments show that the proposed method clearly disentangles discrete factors and significantly outperforms current disentanglement methods based on the disentanglement score and inference network classification score. The source code is available at https://github.com/snumllab/DisentanglementICML19.}
}
Endnote
%0 Conference Paper
%T Learning Discrete and Continuous Factors of Data via Alternating Disentanglement
%A Yeonwoo Jeong
%A Hyun Oh Song
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-jeong19d
%I PMLR
%P 3091--3099
%U https://proceedings.mlr.press/v97/jeong19d.html
%V 97
%X We address the problem of unsupervised disentanglement of discrete and continuous explanatory factors of data. We first show a simple procedure for minimizing the total correlation of the continuous latent variables without having to use a discriminator network or perform importance sampling, via cascading the information flow in the beta-VAE framework. Furthermore, we propose a method which avoids offloading the entire burden of jointly modeling the continuous and discrete factors to the variational encoder by employing a separate discrete inference procedure. This leads to an interesting alternating minimization problem which switches between finding the most likely discrete configuration given the continuous factors and updating the variational encoder based on the computed discrete factors. Experiments show that the proposed method clearly disentangles discrete factors and significantly outperforms current disentanglement methods based on the disentanglement score and inference network classification score. The source code is available at https://github.com/snumllab/DisentanglementICML19.
APA
Jeong, Y. & Song, H. O. (2019). Learning Discrete and Continuous Factors of Data via Alternating Disentanglement. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3091-3099. Available from https://proceedings.mlr.press/v97/jeong19d.html.