Deep Generative Learning via Euler Particle Transport

Yuan Gao, Jian Huang, Yuling Jiao, Jin Liu, Xiliang Lu, Zhijian Yang
Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, PMLR 145:336-368, 2022.

Abstract

We propose an Euler particle transport (EPT) approach to generative learning. EPT is motivated by the problem of constructing an optimal transport map from a reference distribution to a target distribution characterized by the Monge-Ampère equation. Interpreting the infinitesimal linearization of the Monge-Ampère equation from the perspective of gradient flows in measure spaces leads to a stochastic McKean-Vlasov equation. We use the forward Euler method to solve this equation. The resulting forward Euler map pushes forward a reference distribution to the target. This map is the composition of a sequence of simple residual maps, which are computationally stable and easy to train. The key task in training is the estimation of the density ratios or differences that determine the residual maps. We estimate the density ratios based on the Bregman divergence with a gradient penalty using deep density-ratio fitting. We show that the proposed density-ratio estimators do not suffer from the “curse of dimensionality” if data is supported on a lower-dimensional manifold. Numerical experiments with multi-mode synthetic datasets and comparisons with existing methods on real benchmark datasets support our theoretical results and demonstrate the effectiveness of the proposed method.
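
As a rough illustration of the transport scheme described in the abstract, the following Python sketch (an assumption-laden toy, not the authors' implementation) pushes a one-dimensional Gaussian reference sample toward a Gaussian target by composing forward Euler residual maps. The velocity field here is the analytic gradient of a log density ratio, with the current particle distribution approximated by a Gaussian; the paper instead estimates the density ratios or differences with deep networks via a Bregman divergence with a gradient penalty, so the specific velocity form and Gaussian approximation below are illustrative assumptions.

    import numpy as np

    # Illustrative sketch only: forward Euler particle transport in 1D.
    # Each Euler step is a simple residual map x -> x + step * v(x); the full
    # transport map is the composition of these residual maps.
    rng = np.random.default_rng(0)

    mu_t, var_t = 4.0, 0.25                       # target: N(4, 0.25)
    particles = rng.normal(0.0, 1.0, size=5000)   # reference sample: N(0, 1)
    step = 0.05                                   # Euler step size

    for _ in range(200):
        mu_c, var_c = particles.mean(), particles.var()
        # v(x) = grad log p_target(x) - grad log p_current(x), using a Gaussian
        # fit of the current particle distribution (in EPT this quantity would
        # come from an estimated density ratio rather than an analytic formula).
        v = -(particles - mu_t) / var_t + (particles - mu_c) / var_c
        particles = particles + step * v          # one residual map

    print(particles.mean(), particles.std())      # approaches 4.0 and 0.5

Running the sketch, the particle mean and standard deviation drift toward those of the target, which is the behavior the composed residual maps are meant to achieve.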

Cite this Paper


BibTeX
@InProceedings{pmlr-v145-gao22a,
  title     = {Deep Generative Learning via Euler Particle Transport},
  author    = {Gao, Yuan and Huang, Jian and Jiao, Yuling and Liu, Jin and Lu, Xiliang and Yang, Zhijian},
  booktitle = {Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference},
  pages     = {336--368},
  year      = {2022},
  editor    = {Bruna, Joan and Hesthaven, Jan and Zdeborova, Lenka},
  volume    = {145},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--19 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v145/gao22a/gao22a.pdf},
  url       = {https://proceedings.mlr.press/v145/gao22a.html},
  abstract  = {We propose an Euler particle transport (EPT) approach to generative learning. EPT is motivated by the problem of constructing an optimal transport map from a reference distribution to a target distribution characterized by the Monge-Amp\`ere equation. Interpreting the infinitesimal linearization of the Monge-Amp\`ere equation from the perspective of gradient flows in measure spaces leads to a stochastic McKean-Vlasov equation. We use the forward Euler method to solve this equation. The resulting forward Euler map pushes forward a reference distribution to the target. This map is the composition of a sequence of simple residual maps, which are computationally stable and easy to train. The key task in training is the estimation of the density ratios or differences that determine the residual maps. We estimate the density ratios based on the Bregman divergence with a gradient penalty using deep density-ratio fitting. We show that the proposed density-ratio estimators do not suffer from the ``curse of dimensionality'' if data is supported on a lower-dimensional manifold. Numerical experiments with multi-mode synthetic datasets and comparisons with existing methods on real benchmark datasets support our theoretical results and demonstrate the effectiveness of the proposed method.}
}
Endnote
%0 Conference Paper
%T Deep Generative Learning via Euler Particle Transport
%A Yuan Gao
%A Jian Huang
%A Yuling Jiao
%A Jin Liu
%A Xiliang Lu
%A Zhijian Yang
%B Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Joan Bruna
%E Jan Hesthaven
%E Lenka Zdeborova
%F pmlr-v145-gao22a
%I PMLR
%P 336--368
%U https://proceedings.mlr.press/v145/gao22a.html
%V 145
%X We propose an Euler particle transport (EPT) approach to generative learning. EPT is motivated by the problem of constructing an optimal transport map from a reference distribution to a target distribution characterized by the Monge-Ampère equation. Interpreting the infinitesimal linearization of the Monge-Ampère equation from the perspective of gradient flows in measure spaces leads to a stochastic McKean-Vlasov equation. We use the forward Euler method to solve this equation. The resulting forward Euler map pushes forward a reference distribution to the target. This map is the composition of a sequence of simple residual maps, which are computationally stable and easy to train. The key task in training is the estimation of the density ratios or differences that determine the residual maps. We estimate the density ratios based on the Bregman divergence with a gradient penalty using deep density-ratio fitting. We show that the proposed density-ratio estimators do not suffer from the “curse of dimensionality” if data is supported on a lower-dimensional manifold. Numerical experiments with multi-mode synthetic datasets and comparisons with existing methods on real benchmark datasets support our theoretical results and demonstrate the effectiveness of the proposed method.
APA
Gao, Y., Huang, J., Jiao, Y., Liu, J., Lu, X. & Yang, Z. (2022). Deep Generative Learning via Euler Particle Transport. Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, in Proceedings of Machine Learning Research 145:336-368. Available from https://proceedings.mlr.press/v145/gao22a.html.
