Langevin Monte Carlo and JKO splitting
Proceedings of the 31st Conference On Learning Theory, PMLR 75:1777-1798, 2018.
Abstract
Algorithms based on discretizing Langevin diffusion are popular tools for sampling from high-dimensional distributions. We develop novel connections between such Monte Carlo algorithms, the theory of Wasserstein gradient flow, and the operator splitting approach to solving PDEs. In particular, we show that a proximal version of the Unadjusted Langevin Algorithm corresponds to a scheme that alternates between solving the gradient flows of two specific functionals on the space of probability measures. Using this perspective, we derive some new non-asymptotic results on the convergence properties of this algorithm.
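The splitting scheme described in the abstract can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy, not the paper's exact scheme: it alternates a heat-flow step (inject Gaussian noise, corresponding to the entropy functional's gradient flow) with a proximal step on the potential U (corresponding to the potential-energy functional). For the quadratic potential U(x) = x²/2, the proximal map has the closed form prox_{hU}(y) = y/(1+h), so the target is a standard Gaussian.

```python
import numpy as np

def proximal_ula_quadratic(n_chains=10_000, n_steps=500, h=0.1, seed=0):
    """Proximal Langevin sampler for U(x) = x^2 / 2 (target: standard normal).

    Each iteration splits into two sub-steps:
      1. heat-flow step:  y = x + sqrt(2h) * xi,  xi ~ N(0, 1)
      2. proximal step:   x = prox_{hU}(y) = y / (1 + h)   (closed form for quadratic U)
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(n_chains)  # all chains start at the origin
    for _ in range(n_steps):
        y = x + np.sqrt(2 * h) * rng.standard_normal(n_chains)  # entropy / heat flow
        x = y / (1 + h)                                         # proximal step on U
    return x

samples = proximal_ula_quadratic()
print(samples.mean(), samples.var())  # near 0 and slightly below 1 (O(h) bias)
```

For this toy potential the stationary variance of the discretized chain is 2/(2+h), so the empirical variance sits slightly below the target value 1, consistent with a step-size-dependent bias; for a general smooth potential the proximal step would instead be computed by an inner optimization.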