On the Computational Power of Online Gradient Descent

Vaggos Chatziafratis, Tim Roughgarden, Joshua R. Wang;
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:624-662, 2019.

Abstract

We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
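The object of study is the trajectory of weight vectors produced by online gradient descent, i.e. the sequence generated by the update w ← w − η∇ℓ_t(w) against a stream of loss functions. The following minimal sketch (not the paper's construction; the function names and the choice of quadratic losses are illustrative assumptions) shows the kind of weight-vector evolution the result concerns:

```python
import numpy as np

def online_gradient_descent(grad_fns, w0, eta=0.1):
    """Run online gradient descent: at each round t the learner receives
    the gradient of that round's loss and updates w <- w - eta * grad_t(w).
    Returns the full trajectory of weight vectors."""
    w = np.array(w0, dtype=float)
    trajectory = [w.copy()]
    for grad in grad_fns:
        w = w - eta * grad(w)
        trajectory.append(w.copy())
    return trajectory

# Example stream: quadratic losses l_t(w) = (1/2) * ||w - z_t||^2,
# whose gradient at w is simply (w - z_t).
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
grads = [(lambda z: (lambda w: w - z))(z) for z in targets]
traj = online_gradient_descent(grads, w0=[0.0, 0.0], eta=0.5)
```

The paper's point is that even for simple loss streams like these, predicting fine-grained properties of `traj` can encode arbitrary polynomial-space computations.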
