Gradient Descent Only Converges to Minimizers


Jason D. Lee, Max Simchowitz, Michael I. Jordan, Benjamin Recht;
29th Annual Conference on Learning Theory, PMLR 49:1246-1257, 2016.

Abstract

We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
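The phenomenon can be illustrated numerically. The sketch below (not code from the paper) runs plain gradient descent on an assumed example objective f(x, y) = x^2 + y^4/4 - y^2/2, which has a strict saddle at the origin and local minima at (0, ±1). Random initializations avoid the saddle almost surely; only starting points on the saddle's stable manifold, the measure-zero set {y = 0}, converge to it.

```python
# Illustrative sketch only: gradient descent on an example objective
# f(x, y) = x^2 + y^4/4 - y^2/2, with a strict saddle at (0, 0) and
# local minima at (0, +1) and (0, -1).
import numpy as np

def grad(p):
    """Gradient of f(x, y) = x^2 + y^4/4 - y^2/2."""
    x, y = p
    return np.array([2.0 * x, y**3 - y])

def gradient_descent(p0, step=0.05, iters=2000):
    """Run fixed-step gradient descent from p0 and return the final iterate."""
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad(p)
    return p

rng = np.random.default_rng(0)
for _ in range(5):
    p0 = rng.uniform(-2.0, 2.0, size=2)   # random initialization
    print(p0, "->", gradient_descent(p0))  # ends near (0, +1) or (0, -1)

# Initializing exactly on the stable manifold {y = 0} (a measure-zero set)
# keeps the iterates on the x-axis, so they converge to the saddle (0, 0).
print(gradient_descent([1.5, 0.0]))
```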
