On the Computational Power of Online Gradient Descent

Vaggos Chatziafratis, Tim Roughgarden, Joshua R. Wang
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:624-662, 2019.

Abstract

We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.
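As a point of reference only, the following is a minimal sketch of the standard online gradient descent update w_{t+1} = w_t - η∇ℓ_t(w_t) whose weight-vector trajectory ("evolution of weight vectors") is the object the abstract refers to. This is not the paper's PSPACE-encoding construction; the squared-loss examples and step size below are hypothetical choices for illustration.

```python
# Illustrative sketch of online gradient descent (OGD), not the paper's construction.
# At each round t an adversary reveals a loss; the learner updates
# w_{t+1} = w_t - eta * grad(loss_t)(w_t). Losses and step size are placeholders.
import numpy as np

def online_gradient_descent(grad_fns, w0, eta=0.1):
    """Run OGD on a sequence of per-round gradient oracles.

    grad_fns : list of callables; grad_fns[t](w) returns the gradient of the
               round-t loss at w (chosen online, possibly adversarially).
    w0       : initial weight vector.
    eta      : step size.
    Returns the full trajectory of weight vectors.
    """
    trajectory = [np.asarray(w0, dtype=float)]
    for grad in grad_fns:
        w = trajectory[-1]
        trajectory.append(w - eta * grad(w))
    return trajectory

# Hypothetical example: two rounds of squared-loss examples (x_t, y_t),
# a simple linear learning setting of the kind the paper's results concern.
examples = [(np.array([1.0, 0.0]), 1.0), (np.array([0.0, 1.0]), -1.0)]
grads = [lambda w, x=x, y=y: 2 * (w @ x - y) * x for x, y in examples]
traj = online_gradient_descent(grads, w0=np.zeros(2), eta=0.5)
print(traj[-1])  # final weight vector after both rounds
```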

Cite this Paper

BibTeX
@InProceedings{pmlr-v99-chatziafratis19a,
  title     = {On the Computational Power of Online Gradient Descent},
  author    = {Chatziafratis, Vaggos and Roughgarden, Tim and Wang, Joshua R.},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {624--662},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/chatziafratis19a/chatziafratis19a.pdf},
  url       = {https://proceedings.mlr.press/v99/chatziafratis19a.html},
  abstract  = {We prove that the evolution of weight vectors in online gradient descent can encode arbitrary polynomial-space computations, even in very simple learning settings. Our results imply that, under weak complexity-theoretic assumptions, it is impossible to reason efficiently about the fine-grained behavior of online gradient descent.}
}
APA
Chatziafratis, V., Roughgarden, T. & Wang, J. R. (2019). On the Computational Power of Online Gradient Descent. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:624-662. Available from https://proceedings.mlr.press/v99/chatziafratis19a.html.
