On the interplay between noise and curvature and its effect on optimization and generalization
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3503-3513, 2020.
Abstract
The speed at which one can minimize an expected loss using stochastic methods depends on two properties: the curvature of the loss and the variance of the gradients. While most previous works focus on one or the other of these properties, we explore how their interaction affects optimization speed. Further, as the ultimate goal is good generalization performance, we clarify how both curvature and noise are relevant for properly estimating the generalization gap. Realizing that the limitations of some existing works stem from a confusion between the matrices involved, we also clarify the distinction between the Fisher matrix, the Hessian, and the covariance matrix of the gradients.
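For reference, one standard way to write down the three matrices the abstract distinguishes (a sketch using common conventions, not notation taken from the paper): for a probabilistic model $p_\theta(y \mid x)$ trained with the negative log-likelihood loss $\ell(\theta; x, y) = -\log p_\theta(y \mid x)$ over a data distribution $p$,
\[
H \;=\; \nabla_\theta^2\, \mathbb{E}_{(x,y)\sim p}\big[\ell(\theta; x, y)\big],
\qquad
C \;=\; \mathrm{Cov}_{(x,y)\sim p}\big[\nabla_\theta\, \ell(\theta; x, y)\big],
\]
\[
F \;=\; \mathbb{E}_{x\sim p,\ \hat{y} \sim p_\theta(\cdot \mid x)}\Big[\nabla_\theta \log p_\theta(\hat{y} \mid x)\,\nabla_\theta \log p_\theta(\hat{y} \mid x)^\top\Big].
\]
Under this convention, the Hessian $H$ captures curvature, the gradient covariance $C$ captures noise, and the Fisher matrix $F$ samples labels from the model rather than from the data; the three coincide only under particular conditions (e.g., a well-specified model at an optimum), which is the usual source of the confusion the abstract mentions.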