Fisher information lower bounds for sampling
Proceedings of The 34th International Conference on Algorithmic Learning Theory, PMLR 201:375-410, 2023.
Abstract
We prove two lower bounds for the complexity of non-log-concave sampling within the framework of Balasubramanian et al. (2022), who introduced the use of Fisher information ($\mathsf{FI}$) bounds as a notion of approximate first-order stationarity in sampling. Our first lower bound shows that averaged Langevin Monte Carlo (LMC) is optimal in the regime of large $\mathsf{FI}$, by reducing the problem of finding stationary points in non-convex optimization to sampling. Our second lower bound shows that in the regime of small $\mathsf{FI}$, obtaining an $\mathsf{FI}$ of at most $\varepsilon^2$ from the target distribution requires $\text{poly}(1/\varepsilon)$ queries, which is surprising because it rules out the existence of high-accuracy algorithms (e.g., algorithms using Metropolis–Hastings filters) in this context.
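For reference, in the convention of Balasubramanian et al. (2022) (up to normalization), the relative Fisher information of a measure $\mu$ with respect to the target $\pi$ is
$$\mathsf{FI}(\mu \,\|\, \pi) \;=\; \mathbb{E}_{\mu}\Bigl[\bigl\|\nabla \log \tfrac{\mathrm{d}\mu}{\mathrm{d}\pi}\bigr\|^{2}\Bigr],$$
so a guarantee of "$\mathsf{FI}$ at most $\varepsilon^2$" means the algorithm outputs a point whose law $\mu$ satisfies $\mathsf{FI}(\mu \,\|\, \pi) \le \varepsilon^2$; the quantity vanishes exactly when $\mu = \pi$, which is why it serves as a stationarity measure for sampling.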
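As a minimal sketch of the algorithm named above (not the paper's exact construction): averaged LMC runs the Euler–Maruyama discretization of the Langevin diffusion for the target $\pi \propto e^{-V}$ and outputs a uniformly random iterate, so the returned law is the average of the iterate laws. The potential gradient `grad_V`, step size, and iteration count below are illustrative placeholders.

```python
import numpy as np

def averaged_lmc(grad_V, x0, step, n_iters, rng=None):
    """Averaged Langevin Monte Carlo (sketch).

    Runs the LMC recursion
        x_{k+1} = x_k - step * grad_V(x_k) + sqrt(2 * step) * xi_k,
    with xi_k a standard Gaussian vector, then returns a uniformly
    random iterate, so the output's law averages the iterate laws.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(n_iters):
        xi = rng.standard_normal(x.shape)
        x = x - step * grad_V(x) + np.sqrt(2.0 * step) * xi
        iterates.append(x.copy())
    return iterates[rng.integers(len(iterates))]

# Illustrative use: target pi ∝ exp(-V) with V(x) = ||x||^2 / 2
# (standard Gaussian), so grad_V(x) = x. Step size and iteration
# count are arbitrary choices for demonstration only.
sample = averaged_lmc(grad_V=lambda x: x, x0=np.zeros(2),
                      step=0.01, n_iters=5000)
```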