Generative Conditional Distributions by Neural (Entropic) Optimal Transport

Bao Nguyen, Binh Nguyen, Hieu Trung Nguyen, Viet Anh Nguyen
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:37761-37775, 2024.

Abstract

Learning conditional distributions is challenging because the desired outcome is not a single distribution but a family of distributions, one for each instance of the covariates. We introduce a novel neural entropic optimal transport method designed to effectively learn generative models of conditional distributions, particularly in scenarios with limited sample sizes. Our method relies on the minimax training of two neural networks: a generative network parametrizing the inverse cumulative distribution functions of the conditional distributions, and a second network parametrizing the conditional Kantorovich potential. To prevent overfitting, we regularize the objective function by penalizing the Lipschitz constant of the network output. Experiments on real-world datasets show the effectiveness of our algorithm compared to state-of-the-art conditional distribution learning techniques. Our implementation can be found at https://github.com/nguyenngocbaocmt02/GENTLE.
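
The abstract specifies the training setup in two sentences; the PyTorch sketch below makes that minimax structure concrete. Everything in it is an illustrative assumption rather than the authors' implementation: the entropic semi-dual objective is written on joint (covariate, response) pairs with a squared cost, and the toy data, network sizes, and finite-difference Lipschitz penalty are placeholders. The exact GENTLE objective is in the paper and the linked repository.

# Hypothetical sketch of the minimax training described in the abstract.
# G(x, u) plays the conditional inverse CDF (u ~ Uniform(0, 1)); f(x, y)
# plays the conditional Kantorovich potential. The semi-dual entropic OT
# objective on joint pairs, the squared cost, the toy data, and the
# finite-difference Lipschitz penalty are assumptions, not GENTLE itself.
import math
import torch
import torch.nn as nn

def mlp(in_dim, hidden=64):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, 1),
    )

x_dim, eps, lam = 2, 0.1, 1.0
G = mlp(x_dim + 1)  # generator: (x, u) -> y, parametrizes the quantile map
f = mlp(x_dim + 1)  # potential: (x, y) -> scalar
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_f = torch.optim.Adam(f.parameters(), lr=1e-3)

def ot_objective(x, y):
    """Entropic OT semi-dual on one batch: E_real[f] + E_fake[f^c_eps],
    where f^c_eps is the eps-smoothed c-transform of f. Treating (x, y)
    jointly with a squared cost is a simplifying assumption."""
    u = torch.rand(x.size(0), 1)
    y_fake = G(torch.cat([x, u], dim=-1))
    f_real = f(torch.cat([x, y], dim=-1))                      # (B, 1)
    cost = torch.cdist(torch.cat([x, y], dim=-1),
                       torch.cat([x, y_fake], dim=-1)) ** 2    # (B, B)
    # f^c_eps(fake_j) = -eps * log mean_i exp((f(real_i) - cost_ij) / eps)
    f_c = -eps * (torch.logsumexp((f_real - cost) / eps, dim=0)
                  - math.log(x.size(0)))                       # (B,)
    return f_real.mean() + f_c.mean()

def lipschitz_penalty(x, delta=1e-2):
    """Finite-difference proxy penalizing the Lipschitz constant of the
    generator output in the covariates (the paper's exact penalty may be
    placed and computed differently)."""
    u = torch.rand(x.size(0), 1)
    dx = delta * torch.randn_like(x)
    diff = G(torch.cat([x + dx, u], dim=-1)) - G(torch.cat([x, u], dim=-1))
    return (diff.squeeze(-1).abs() / dx.norm(dim=-1)).pow(2).mean()

for step in range(2000):
    # Toy conditional data y | x ~ N(sum(x), 0.1^2) stands in for a dataset.
    x = torch.randn(256, x_dim)
    y = x.sum(dim=-1, keepdim=True) + 0.1 * torch.randn(256, 1)
    # Inner maximization over the Kantorovich potential f ...
    opt_f.zero_grad()
    (-ot_objective(x, y)).backward()
    opt_f.step()
    # ... outer minimization over the generator G, Lipschitz-regularized.
    opt_g.zero_grad()
    (ot_objective(x, y) + lam * lipschitz_penalty(x)).backward()
    opt_g.step()

After training, samples from the learned conditional distribution at a covariate x are drawn by pushing uniform noise through the generator, e.g. G(torch.cat([x, torch.rand(n, 1)], dim=-1)); sweeping u over a grid instead traces out the learned conditional quantile curve.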

Cite this Paper


BibTeX

@InProceedings{pmlr-v235-nguyen24h,
  title     = {Generative Conditional Distributions by Neural ({E}ntropic) Optimal Transport},
  author    = {Nguyen, Bao and Nguyen, Binh and Nguyen, Hieu Trung and Nguyen, Viet Anh},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {37761--37775},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/nguyen24h/nguyen24h.pdf},
  url       = {https://proceedings.mlr.press/v235/nguyen24h.html}
}
Endnote

%0 Conference Paper
%T Generative Conditional Distributions by Neural (Entropic) Optimal Transport
%A Bao Nguyen
%A Binh Nguyen
%A Hieu Trung Nguyen
%A Viet Anh Nguyen
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-nguyen24h
%I PMLR
%P 37761--37775
%U https://proceedings.mlr.press/v235/nguyen24h.html
%V 235
APA

Nguyen, B., Nguyen, B., Nguyen, H. T., & Nguyen, V. A. (2024). Generative Conditional Distributions by Neural (Entropic) Optimal Transport. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:37761-37775. Available from https://proceedings.mlr.press/v235/nguyen24h.html.