On the Private Estimation of Smooth Transport Maps

Clément Lalanne, Franck Iutzeler, Jean-Michel Loubes, Julien Chhor
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:32306-32338, 2025.

Abstract

Estimating optimal transport maps between two distributions from respective samples is an important element for many machine learning methods. To do so, rather than extending discrete transport maps, it has been shown that estimating the Brenier potential of the transport problem and obtaining a transport map through its gradient is near minimax optimal for smooth problems. In this paper, we investigate the private estimation of such potentials and transport maps with respect to the distribution samples. We propose a differentially private transport map estimator with $L^2$ error at most $n^{-1} \vee n^{-\frac{2 \alpha}{2 \alpha - 2 + d}} \vee (n\epsilon)^{-\frac{2 \alpha}{2 \alpha + d}}$ up to polylog terms, where $n$ is the sample size, $\epsilon$ is the desired level of privacy, $\alpha$ is the smoothness of the true transport map, and $d$ is the dimension of the feature space. We also provide a lower bound for the problem.
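
To make the stated rate concrete, here is a minimal Python sketch (not part of the paper's code) that evaluates the maximum of the three terms in the upper bound, ignoring polylog factors; the function name and the example values of n, epsilon, alpha, and d are illustrative assumptions, chosen only to show which term dominates in different regimes.

# Sketch: evaluate the stated L^2 error upper bound
# n^{-1}  v  n^{-2*alpha/(2*alpha - 2 + d)}  v  (n*eps)^{-2*alpha/(2*alpha + d)}
# (polylog factors ignored). All parameter values below are illustrative, not from the paper.

def l2_error_upper_bound(n: int, eps: float, alpha: float, d: int) -> float:
    """Maximum of the three terms in the stated upper bound on the L^2 error."""
    parametric = n ** (-1.0)                                      # parametric term
    nonprivate = n ** (-2.0 * alpha / (2.0 * alpha - 2.0 + d))    # nonprivate estimation term
    private = (n * eps) ** (-2.0 * alpha / (2.0 * alpha + d))     # privacy term
    return max(parametric, nonprivate, private)

if __name__ == "__main__":
    # Illustrative smoothness alpha = 3 and dimension d = 4.
    for n in (10**3, 10**5, 10**7):
        for eps in (0.1, 1.0):
            print(f"n={n:>8}, eps={eps:>4}: bound ~ {l2_error_upper_bound(n, eps, alpha=3.0, d=4):.3e}")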

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-lalanne25a,
  title     = {On the Private Estimation of Smooth Transport Maps},
  author    = {Lalanne, Cl\'{e}ment and Iutzeler, Franck and Loubes, Jean-Michel and Chhor, Julien},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {32306--32338},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/lalanne25a/lalanne25a.pdf},
  url       = {https://proceedings.mlr.press/v267/lalanne25a.html},
  abstract  = {Estimating optimal transport maps between two distributions from respective samples is an important element for many machine learning methods. To do so, rather than extending discrete transport maps, it has been shown that estimating the Brenier potential of the transport problem and obtaining a transport map through its gradient is near minimax optimal for smooth problems. In this paper, we investigate the private estimation of such potentials and transport maps with respect to the distribution samples. We propose a differentially private transport map estimator with $L^2$ error at most $n^{-1} \vee n^{-\frac{2 \alpha}{2 \alpha - 2 + d}} \vee (n\epsilon)^{-\frac{2 \alpha}{2 \alpha + d}}$ up to polylog terms, where $n$ is the sample size, $\epsilon$ is the desired level of privacy, $\alpha$ is the smoothness of the true transport map, and $d$ is the dimension of the feature space. We also provide a lower bound for the problem.}
}
Endnote
%0 Conference Paper
%T On the Private Estimation of Smooth Transport Maps
%A Clément Lalanne
%A Franck Iutzeler
%A Jean-Michel Loubes
%A Julien Chhor
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-lalanne25a
%I PMLR
%P 32306--32338
%U https://proceedings.mlr.press/v267/lalanne25a.html
%V 267
%X Estimating optimal transport maps between two distributions from respective samples is an important element for many machine learning methods. To do so, rather than extending discrete transport maps, it has been shown that estimating the Brenier potential of the transport problem and obtaining a transport map through its gradient is near minimax optimal for smooth problems. In this paper, we investigate the private estimation of such potentials and transport maps with respect to the distribution samples. We propose a differentially private transport map estimator with $L^2$ error at most $n^{-1} \vee n^{-\frac{2 \alpha}{2 \alpha - 2 + d}} \vee (n\epsilon)^{-\frac{2 \alpha}{2 \alpha + d}}$ up to polylog terms, where $n$ is the sample size, $\epsilon$ is the desired level of privacy, $\alpha$ is the smoothness of the true transport map, and $d$ is the dimension of the feature space. We also provide a lower bound for the problem.
APA
Lalanne, C., Iutzeler, F., Loubes, J.-M., & Chhor, J. (2025). On the Private Estimation of Smooth Transport Maps. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:32306-32338. Available from https://proceedings.mlr.press/v267/lalanne25a.html.
