Stochastic Gradient Flow Dynamics of Test Risk and its Exact Solution for Weak Features

Rodrigo Veiga, Anastasia Remizova, Nicolas Macris
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:49310-49344, 2024.

Abstract

We investigate the test risk of a continuous time stochastic gradient flow dynamics in learning theory. Using a path integral formulation we provide, in the regime of small learning rate, a general formula for computing the difference between test risk curves of pure gradient and stochastic gradient flows. We apply the general theory to a simple model of weak features, which displays the double descent phenomenon, and explicitly compute the corrections brought about by the added stochastic term in the dynamics, as a function of time and model parameters. The analytical results are compared to simulations of discrete time stochastic gradient descent and show good agreement.

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-veiga24a,
  title     = {Stochastic Gradient Flow Dynamics of Test Risk and its Exact Solution for Weak Features},
  author    = {Veiga, Rodrigo and Remizova, Anastasia and Macris, Nicolas},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {49310--49344},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/veiga24a/veiga24a.pdf},
  url       = {https://proceedings.mlr.press/v235/veiga24a.html},
  abstract  = {We investigate the test risk of a continuous time stochastic gradient flow dynamics in learning theory. Using a path integral formulation we provide, in the regime of small learning rate, a general formula for computing the difference between test risk curves of pure gradient and stochastic gradient flows. We apply the general theory to a simple model of weak features, which displays the double descent phenomenon, and explicitly compute the corrections brought about by the added stochastic term in the dynamics, as a function of time and model parameters. The analytical results are compared to simulations of discrete time stochastic gradient descent and show good agreement.}
}
Endnote
%0 Conference Paper
%T Stochastic Gradient Flow Dynamics of Test Risk and its Exact Solution for Weak Features
%A Rodrigo Veiga
%A Anastasia Remizova
%A Nicolas Macris
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-veiga24a
%I PMLR
%P 49310--49344
%U https://proceedings.mlr.press/v235/veiga24a.html
%V 235
%X We investigate the test risk of a continuous time stochastic gradient flow dynamics in learning theory. Using a path integral formulation we provide, in the regime of small learning rate, a general formula for computing the difference between test risk curves of pure gradient and stochastic gradient flows. We apply the general theory to a simple model of weak features, which displays the double descent phenomenon, and explicitly compute the corrections brought about by the added stochastic term in the dynamics, as a function of time and model parameters. The analytical results are compared to simulations of discrete time stochastic gradient descent and show good agreement.
APA
Veiga, R., Remizova, A. &amp; Macris, N. (2024). Stochastic Gradient Flow Dynamics of Test Risk and its Exact Solution for Weak Features. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:49310-49344. Available from https://proceedings.mlr.press/v235/veiga24a.html.