Flat Metric Minimization with Applications in Generative Modeling

Thomas Möllenhoff, Daniel Cremers
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4626-4635, 2019.

Abstract

We take the novel perspective of viewing data not as a probability distribution but rather as a current. Primarily studied in the field of geometric measure theory, k-currents are continuous linear functionals acting on compactly supported smooth differential forms, and they can be understood as a generalized notion of an oriented k-dimensional manifold. By moving from distributions (which are 0-currents) to k-currents, we can explicitly orient the data by attaching a k-dimensional tangent plane to each sample point. Based on the flat metric, a fundamental distance between currents, we derive FlatGAN, a formulation in the spirit of generative adversarial networks but generalized to k-currents. As our theoretical contribution, we prove that the flat metric between a parametrized current and a reference current is Lipschitz continuous in the parameters. In experiments, we show that the proposed shift to k > 0 leads to interpretable and disentangled latent representations which behave equivariantly with respect to the specified oriented tangent planes.
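The flat metric referenced above has a standard dual formulation in geometric measure theory (notation here may differ from the paper's): for a k-current T,

    \mathbb{F}(T) = \sup \bigl\{ T(\omega) \;:\; \omega \in \mathcal{D}^k(\mathbb{R}^n),\ \|\omega\|_\infty \le 1,\ \|\mathrm{d}\omega\|_\infty \le 1 \bigr\},

and the flat metric between two k-currents S and T is \mathbb{F}(S - T). For k = 0, differential forms are smooth functions and d\omega is the ordinary differential, so the constraints ask for a critic that is bounded by one and has gradient norm at most one; the flat metric then coincides with the bounded-Lipschitz (Dudley) distance, a close relative of the Kantorovich-Rubinstein dual behind Wasserstein GANs. The sketch below illustrates how such a k = 0 critic objective could look in PyTorch; it is a minimal sketch under these assumptions, not the authors' FlatGAN implementation, and flat_critic_loss is a hypothetical helper name.

    import torch
    import torch.nn as nn

    # Hypothetical k = 0 critic: the final tanh keeps |omega(x)| <= 1.
    critic = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1), nn.Tanh())

    def flat_critic_loss(critic, real, fake, penalty_weight=10.0):
        """Illustrative k = 0 flat-metric critic loss (not the paper's code).

        Dual objective: maximize E_real[omega] - E_fake[omega] subject to
        |omega| <= 1 (tanh output) and |d omega| <= 1 (gradient penalty).
        """
        # Negated dual objective, so that minimizing this loss maximizes
        # E_real[omega] - E_fake[omega].
        loss = critic(fake).mean() - critic(real).mean()

        # Penalize gradient norms above 1 at random interpolates, softly
        # enforcing ||d omega||_inf <= 1 in the spirit of the one-sided
        # WGAN-GP gradient penalty.
        eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
        x = (eps * real + (1 - eps) * fake).detach().requires_grad_(True)
        grad, = torch.autograd.grad(critic(x).sum(), x, create_graph=True)
        grad_norm = grad.flatten(1).norm(dim=1)
        return loss + penalty_weight * (grad_norm - 1).clamp(min=0).pow(2).mean()

Here the tanh output enforces |\omega| \le 1 exactly, while the gradient penalty enforces \|\mathrm{d}\omega\|_\infty \le 1 only approximately; the paper's formulation for general k > 0 additionally pairs the critic with oriented tangent planes, which this 0-current sketch does not capture.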

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-mollenhoff19a,
  title     = {Flat Metric Minimization with Applications in Generative Modeling},
  author    = {M{\"o}llenhoff, Thomas and Cremers, Daniel},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4626--4635},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/mollenhoff19a/mollenhoff19a.pdf},
  url       = {https://proceedings.mlr.press/v97/mollenhoff19a.html}
}
Endnote
%0 Conference Paper
%T Flat Metric Minimization with Applications in Generative Modeling
%A Thomas Möllenhoff
%A Daniel Cremers
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-mollenhoff19a
%I PMLR
%P 4626--4635
%U https://proceedings.mlr.press/v97/mollenhoff19a.html
%V 97
APA
Möllenhoff, T. & Cremers, D.. (2019). Flat Metric Minimization with Applications in Generative Modeling. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4626-4635 Available from https://proceedings.mlr.press/v97/mollenhoff19a.html.
