Condition-number-independent Convergence Rate of Riemannian Hamiltonian Monte Carlo with Numerical Integrators
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:4504-4569, 2023.
Abstract
We study the convergence rate of discretized Riemannian Hamiltonian Monte Carlo (RHMC) for sampling from distributions of the form $e^{-f(x)}$ on a convex body $\mathcal{M}\subset\mathbb{R}^{n}$. We show that for distributions of the form $e^{-\alpha^{\top}x}$ on a polytope with $m$ constraints, the convergence rate of a family of commonly used integrators is independent of $\left\|\alpha\right\|_{2}$ and the geometry of the polytope. In particular, the implicit midpoint method (IMM) and the generalized Leapfrog method (LM) have a mixing time of $\widetilde{O}\left(mn^{3}\right)$ to achieve $\epsilon$ total variation distance to the target distribution. These guarantees are based on a general bound on the convergence rate for densities of the form $e^{-f(x)}$ in terms of parameters of the manifold and the integrator. Our theoretical guarantees complement the empirical results of \cite{kook2022sampling}, which show that RHMC with IMM can efficiently sample from ill-conditioned, non-smooth, and constrained distributions in very high dimension in practice.
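The paper's IMM is applied to the Riemannian Hamiltonian dynamics; as a point of orientation only, here is a minimal Euclidean sketch (not the paper's method) of one implicit midpoint step $z_{k+1} = z_k + h\,F\!\left(\tfrac{z_k + z_{k+1}}{2}\right)$ for a Hamiltonian vector field $F(x,p) = (\partial H/\partial p,\, -\partial H/\partial x)$, with the implicit equation solved by fixed-point iteration. The step size `h`, the iteration count, and the harmonic-oscillator test Hamiltonian are all illustrative choices, not from the paper.

```python
import numpy as np

def implicit_midpoint_step(z, h, field, max_iters=50, tol=1e-12):
    """One implicit midpoint step: solve z_next = z + h * field((z + z_next)/2)
    by fixed-point iteration (converges for small enough h)."""
    z_next = z.copy()
    for _ in range(max_iters):
        z_new = z + h * field(0.5 * (z + z_next))
        if np.linalg.norm(z_new - z_next) < tol:
            return z_new
        z_next = z_new
    return z_next

# Illustrative Hamiltonian: harmonic oscillator H(x, p) = (x^2 + p^2) / 2,
# whose vector field is F(x, p) = (p, -x).
def field(z):
    x, p = z
    return np.array([p, -x])

z = np.array([1.0, 0.0])  # initial (position, momentum)
for _ in range(1000):
    z = implicit_midpoint_step(z, h=0.01, field=field)

energy = 0.5 * (z[0] ** 2 + z[1] ** 2)
```

Because the implicit midpoint rule is symplectic and preserves quadratic invariants, the energy above stays essentially at its initial value 0.5 over the whole trajectory; this conservation behavior is what makes such integrators natural candidates for long Hamiltonian trajectories in (R)HMC.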