Improved Analysis of Score-based Generative Modeling: User-Friendly Bounds under Minimal Smoothness Assumptions
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:4735-4763, 2023.
Abstract
We give an improved theoretical analysis of score-based generative modeling. Given a score estimate with small L2 error (averaged across timesteps), we provide efficient convergence guarantees for any data distribution with a finite second moment, by either employing early stopping or assuming a smoothness condition on the score function of the data distribution. Our result does not rely on any log-concavity or functional inequality assumption and has only a logarithmic dependence on the smoothness. In particular, we show that under only a finite second moment condition, approximating the following to ϵ-accuracy in reverse KL divergence can be done in Õ(d log(1/δ)/ϵ) steps: 1) the variance-δ Gaussian perturbation of any data distribution; 2) data distributions with 1/δ-smooth score functions. Our analysis also provides a quantitative comparison between different discrete approximations and may guide the choice of discretization points in practice.
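To make the log(1/δ) factor concrete, the sketch below generates a reverse-time discretization whose points cluster geometrically near the early-stopping time δ, so that halving δ adds only a constant number of extra steps. This is a minimal illustration of how O(log(1/δ)) points can cover [δ, T]; the function name and the specific geometric schedule are assumptions for exposition, not the paper's prescribed discretization.

```python
import numpy as np

def geometric_time_grid(T=1.0, delta=1e-4, points_per_halving=4):
    """Illustrative reverse-time discretization of [delta, T].

    Points are spaced geometrically, so shrinking delta by a factor
    of 2 adds only O(points_per_halving) extra steps -- mirroring the
    log(1/delta) dependence in the step-count bound quoted above.
    (Hypothetical schedule chosen for illustration only.)
    """
    num_halvings = int(np.ceil(np.log2(T / delta)))
    n = points_per_halving * num_halvings
    # t_k = T * (delta/T)^(k/n) decays geometrically from T down to delta.
    k = np.arange(n + 1)
    return T * (delta / T) ** (k / n)

grid = geometric_time_grid(T=1.0, delta=1e-4)
print(len(grid), grid[:3], grid[-3:])  # O(log(1/delta)) points, clustered near delta
```

Under this toy schedule, reducing δ from 1e-4 to 1e-8 roughly doubles the number of points rather than increasing it by a factor of 10^4, which is the qualitative behavior the logarithmic dependence in the bound describes.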