Gaussian-Smoothed Optimal Transport: Metric Structure and Statistical Efficiency
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3327-3337, 2020.
Abstract
Optimal transport (OT), and in particular the Wasserstein distance, has seen a surge of interest and applications in machine learning. However, empirical approximation under Wasserstein distances suffers from a severe curse of dimensionality, rendering them impractical in high dimensions. As a result, entropically regularized OT has become a popular workaround. Yet, while it enjoys fast algorithms and better statistical properties, it loses the metric structure that Wasserstein distances enjoy. This work proposes a novel Gaussian-smoothed OT (GOT) framework that achieves the best of both worlds: preserving the 1-Wasserstein metric structure while alleviating the empirical approximation curse of dimensionality. Furthermore, as the Gaussian-smoothing parameter shrinks to zero, GOT $\Gamma$-converges towards classic OT (with convergence of optimizers), thus serving as a natural extension. An empirical study that validates the theoretical results is provided, promoting Gaussian-smoothed OT as a powerful alternative to entropic OT.
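The following is a minimal sketch of the GOT idea described above, not the paper's exact estimator: each empirical measure is convolved with an isotropic Gaussian $\mathcal{N}(0, \sigma^2 I)$, here approximated simply by adding Gaussian noise to the samples, and the 1-Wasserstein distance is then computed between the smoothed samples. It assumes the POT library (`pip install pot`), and the function name `gaussian_smoothed_w1` is chosen for illustration.

```python
import numpy as np
import ot  # Python Optimal Transport (POT)


def gaussian_smoothed_w1(x, y, sigma, seed=None):
    """Rough estimate of W1(P * N(0, sigma^2 I), Q * N(0, sigma^2 I)) from samples.

    x : (n, d) samples from P
    y : (m, d) samples from Q
    sigma : Gaussian smoothing parameter; sigma -> 0 recovers classic W1
    """
    rng = np.random.default_rng(seed)
    # Crude Monte Carlo smoothing: add independent Gaussian noise to each sample.
    x_s = x + sigma * rng.standard_normal(x.shape)
    y_s = y + sigma * rng.standard_normal(y.shape)
    # Euclidean cost matrix + exact (unregularized) OT gives the 1-Wasserstein cost.
    cost = ot.dist(x_s, y_s, metric="euclidean")
    a = np.full(len(x_s), 1.0 / len(x_s))
    b = np.full(len(y_s), 1.0 / len(y_s))
    return ot.emd2(a, b, cost)


# Example: two Gaussians in d = 10 dimensions, shifted by one unit per coordinate.
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 10))
y = rng.standard_normal((500, 10)) + 1.0
print(gaussian_smoothed_w1(x, y, sigma=1.0, seed=1))
```

Because the smoothed measures are compared under the plain 1-Wasserstein distance, the triangle inequality and symmetry carry over directly, which is the metric structure that entropic regularization gives up.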