On the Computational Power of Online Gradient Descent
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:624-662, 2019.
Abstract
We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
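The object studied in the abstract is the trajectory of weight vectors produced by online gradient descent. As a point of reference, here is a minimal sketch of that trajectory for a stream of examples; the squared loss and the fixed step size `eta` are illustrative assumptions, not the specific learning setting analyzed in the paper.

```python
import numpy as np

def ogd_trajectory(examples, eta=0.1, dim=2):
    """Run online gradient descent with squared loss on a stream of
    (x, y) examples and return the sequence of weight vectors."""
    w = np.zeros(dim)
    trajectory = [w.copy()]
    for x, y in examples:
        x = np.asarray(x, dtype=float)
        # Gradient of the per-example loss 0.5 * (w . x - y)^2 w.r.t. w
        grad = (w @ x - y) * x
        w = w - eta * grad
        trajectory.append(w.copy())
    return trajectory
```

The paper's point is that this seemingly simple dynamical system is computationally rich: the sequence `trajectory` can be made to simulate arbitrary polynomial-space computations.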