Normal Approximation for Stochastic Gradient Descent via Non-Asymptotic Rates of Martingale CLT

Andreas Anastasiou, Krishnakumar Balasubramanian, Murat A. Erdogdu;
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:115-137, 2019.

Abstract

We provide non-asymptotic convergence rates of the Polyak-Ruppert averaged stochastic gradient descent (SGD) to a normal random vector for a class of twice-differentiable test functions. A crucial intermediate step is proving a non-asymptotic martingale central limit theorem (CLT), i.e., establishing the rates of convergence of a multivariate martingale difference sequence to a normal random vector, which might be of independent interest. We obtain the explicit rates for the multivariate martingale CLT using a combination of Stein's method and Lindeberg's argument, which is then used in conjunction with a non-asymptotic analysis of averaged SGD proposed in [PJ92]. Our results have potentially interesting consequences for computing confidence intervals for parameter estimation with SGD and constructing hypothesis tests with SGD that are valid in a non-asymptotic sense.
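For readers unfamiliar with the averaging scheme the abstract refers to, the following is a minimal sketch (not from the paper) of Polyak-Ruppert averaged SGD on a toy least-squares problem. The problem setup, step-size schedule, and the ground-truth vector `theta_star` are illustrative assumptions; the paper's results concern the distribution of the averaged iterate around the optimum, which this sketch only simulates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy streaming least-squares problem: minimize E[0.5 * (y - x @ theta)^2].
# theta_star is a hypothetical ground truth chosen for illustration.
d = 3
theta_star = np.array([1.0, -2.0, 0.5])

def stochastic_grad(theta):
    # Draw one fresh sample (x, y) and return the gradient of
    # 0.5 * (y - x @ theta)^2 with respect to theta.
    x = rng.normal(size=d)
    y = x @ theta_star + rng.normal()
    return -(y - x @ theta) * x

n = 20000
theta = np.zeros(d)       # current SGD iterate
theta_bar = np.zeros(d)   # Polyak-Ruppert running average of iterates
for t in range(1, n + 1):
    eta = 1.0 / t**0.6    # slowly decaying step size, in the spirit of [PJ92]
    theta -= eta * stochastic_grad(theta)
    theta_bar += (theta - theta_bar) / t  # incremental mean update

print(np.round(theta_bar, 2))
```

The averaged iterate `theta_bar` is the quantity whose fluctuations around the optimum are, per the paper, approximately normal; repeating this simulation over many seeds and inspecting the empirical distribution of `theta_bar - theta_star` is one way to visualize the CLT the abstract describes.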
