Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions

Antoine Liutkus, Umut Simsekli, Szymon Majewski, Alain Durmus, Fabian-Robert Stöter
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4104-4113, 2019.

Abstract

By building upon the recent theory that established the connection between implicit generative modeling (IGM) and optimal transport, in this study, we propose a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem, which aims at finding a measure that is as close as possible to the data distribution while remaining expressive enough for generative modeling purposes. We formulate the problem as a gradient flow in the space of probability measures. The connections between gradient flows and stochastic differential equations let us develop a computationally efficient algorithm for solving the optimization problem. We provide a formal theoretical analysis in which we prove finite-time error guarantees for the proposed algorithm. To the best of our knowledge, the proposed algorithm is the first nonparametric IGM algorithm with explicit theoretical guarantees. Our experimental results support our theory and show that our algorithm is able to successfully capture the structure of different types of data distributions.
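The central quantity behind the flow described above is the sliced-Wasserstein distance: the d-dimensional optimal-transport problem is reduced to many one-dimensional ones by projecting both distributions onto random directions, where the Wasserstein distance has a closed form via sorting. The sketch below is not the authors' full algorithm, only a minimal Monte Carlo estimator of the sliced-Wasserstein distance between two equally sized sample sets; the function name and parameters are illustrative choices.

```python
import numpy as np

def sliced_wasserstein_distance(X, Y, n_projections=100, p=2, seed=0):
    """Monte Carlo estimate of the sliced-Wasserstein distance between two
    empirical distributions given as (n_samples, dim) arrays of equal size.

    Each random direction theta reduces the d-dimensional transport problem
    to a 1-D one, where the p-Wasserstein distance between empirical measures
    is obtained by sorting both projected samples and comparing order
    statistics.
    """
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    # Draw directions uniformly on the unit sphere (normalized Gaussians).
    thetas = rng.normal(size=(n_projections, dim))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    # Project both sample sets onto every direction, then sort along samples.
    X_proj = np.sort(X @ thetas.T, axis=0)  # shape (n_samples, n_projections)
    Y_proj = np.sort(Y @ thetas.T, axis=0)
    # Average the 1-D p-Wasserstein costs over all projections.
    return np.mean(np.abs(X_proj - Y_proj) ** p) ** (1.0 / p)
```

For example, the distance between a sample set and itself is zero, while shifting every point by a constant yields a distance close to the shift's typical projected magnitude.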

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-liutkus19a,
  title     = {Sliced-{W}asserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions},
  author    = {Liutkus, Antoine and Simsekli, Umut and Majewski, Szymon and Durmus, Alain and St{\"o}ter, Fabian-Robert},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4104--4113},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/liutkus19a/liutkus19a.pdf},
  url       = {https://proceedings.mlr.press/v97/liutkus19a.html},
  abstract  = {By building upon the recent theory that established the connection between implicit generative modeling (IGM) and optimal transport, in this study, we propose a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem, which aims at finding a measure that is close to the data distribution as much as possible and also expressive enough for generative modeling purposes. We formulate the problem as a gradient flow in the space of probability measures. The connections between gradient flows and stochastic differential equations let us develop a computationally efficient algorithm for solving the optimization problem. We provide formal theoretical analysis where we prove finite-time error guarantees for the proposed algorithm. To the best of our knowledge, the proposed algorithm is the first nonparametric IGM algorithm with explicit theoretical guarantees. Our experimental results support our theory and show that our algorithm is able to successfully capture the structure of different types of data distributions.}
}
Endnote
%0 Conference Paper
%T Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions
%A Antoine Liutkus
%A Umut Simsekli
%A Szymon Majewski
%A Alain Durmus
%A Fabian-Robert Stöter
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-liutkus19a
%I PMLR
%P 4104--4113
%U https://proceedings.mlr.press/v97/liutkus19a.html
%V 97
%X By building upon the recent theory that established the connection between implicit generative modeling (IGM) and optimal transport, in this study, we propose a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem, which aims at finding a measure that is close to the data distribution as much as possible and also expressive enough for generative modeling purposes. We formulate the problem as a gradient flow in the space of probability measures. The connections between gradient flows and stochastic differential equations let us develop a computationally efficient algorithm for solving the optimization problem. We provide formal theoretical analysis where we prove finite-time error guarantees for the proposed algorithm. To the best of our knowledge, the proposed algorithm is the first nonparametric IGM algorithm with explicit theoretical guarantees. Our experimental results support our theory and show that our algorithm is able to successfully capture the structure of different types of data distributions.
APA
Liutkus, A., Simsekli, U., Majewski, S., Durmus, A. & Stöter, F. (2019). Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4104-4113. Available from https://proceedings.mlr.press/v97/liutkus19a.html.