Stochastic Gradient Flow Dynamics of Test Risk and its Exact Solution for Weak Features
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:49310-49344, 2024.
Abstract
We investigate the test risk of continuous-time stochastic gradient flow dynamics in learning theory. Using a path integral formulation, we provide, in the regime of small learning rate, a general formula for computing the difference between the test risk curves of pure gradient flow and stochastic gradient flow. We apply the general theory to a simple model of weak features, which displays the double descent phenomenon, and explicitly compute the corrections brought about by the added stochastic term in the dynamics, as a function of time and model parameters. The analytical results are compared to simulations of discrete-time stochastic gradient descent and show good agreement.
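To illustrate the kind of comparison the abstract describes, the sketch below simulates discrete-time gradient descent versus minibatch stochastic gradient descent on a simple linear "weak features" model, where the student only observes a subset of the teacher's features, and tracks the test-risk gap between the two trajectories. This is a minimal illustrative setup; the dimensions, step size, batch size, and noise level are assumptions chosen for demonstration, not the paper's exact model or formula.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact setup):
# compare test-risk curves of full-batch gradient descent and minibatch SGD
# on a linear model where the student sees only the first d of D features.
import numpy as np

rng = np.random.default_rng(0)

D, d = 60, 40              # teacher dimension D, student (weak) feature dimension d
n_train, n_test = 50, 2000
eta, steps, batch = 1e-2, 2000, 5

beta_star = rng.normal(size=D) / np.sqrt(D)          # teacher weights
X_train = rng.normal(size=(n_train, D))
X_test = rng.normal(size=(n_test, D))
y_train = X_train @ beta_star + 0.1 * rng.normal(size=n_train)
y_test = X_test @ beta_star + 0.1 * rng.normal(size=n_test)

# The student only uses the first d columns ("weak" features).
A_train, A_test = X_train[:, :d], X_test[:, :d]

def test_risk(w):
    return np.mean((A_test @ w - y_test) ** 2)

def train(stochastic):
    """Run (stochastic) gradient descent and record the test risk at each step."""
    w = np.zeros(d)
    risks = []
    for _ in range(steps):
        if stochastic:
            idx = rng.choice(n_train, size=batch, replace=False)
            A, y = A_train[idx], y_train[idx]
        else:
            A, y = A_train, y_train
        grad = 2.0 * A.T @ (A @ w - y) / len(y)      # gradient of the empirical MSE
        w -= eta * grad
        risks.append(test_risk(w))
    return np.array(risks)

risk_gd = train(stochastic=False)
# Average SGD trajectories over several noise realizations.
risk_sgd = np.mean([train(stochastic=True) for _ in range(20)], axis=0)

print("final test risk  GD :", risk_gd[-1])
print("final test risk  SGD:", risk_sgd[-1])
print("max |SGD - GD| gap  :", np.max(np.abs(risk_sgd - risk_gd)))
```

The gap between the averaged SGD curve and the gradient descent curve plays the role of the correction term the paper computes analytically in the small learning rate regime; here it is only estimated numerically on a toy instance.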