Katyusha X: Simple Momentum Method for Stochastic Sum-of-Nonconvex Optimization
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:179-185, 2018.
Abstract
The problem of minimizing sum-of-nonconvex functions (i.e., convex functions that are averages of nonconvex ones) is becoming increasingly important in machine learning, and is the core machinery for PCA, SVD, regularized Newton's method, accelerated nonconvex optimization, and more. We show how to provably obtain an accelerated stochastic algorithm for minimizing sum-of-nonconvex functions, by adding one additional line to the well-known SVRG method. This line corresponds to momentum, and shows how to directly apply momentum to the finite-sum stochastic minimization of sum-of-nonconvex functions. As a side result, our method enjoys linear parallel speedup using mini-batch.
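To make the idea concrete, here is a minimal sketch of SVRG with one added momentum line, in the spirit the abstract describes. This is an illustrative assumption, not the paper's exact update rule: the placement of the extrapolation between epochs and the momentum coefficient `tau` are schematic choices, and the function names (`svrg_with_momentum`, `grads`, `full_grad`) are hypothetical.

```python
import numpy as np

def svrg_with_momentum(grads, full_grad, x0, lr=0.01, tau=0.5,
                       epochs=30, inner=100, seed=0):
    """Schematic SVRG plus one momentum (extrapolation) line.

    grads[i](x) returns the gradient of the i-th component f_i at x;
    full_grad(x) returns the gradient of their average f. The momentum
    coefficient tau and its placement between epochs are illustrative,
    not the exact Katyusha X update.
    """
    rng = np.random.default_rng(seed)
    n = len(grads)
    y_prev = y = x = np.array(x0, dtype=float)
    for _ in range(epochs):
        mu = full_grad(x)          # full gradient at the snapshot point
        w = x.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # variance-reduced stochastic gradient (standard SVRG step)
            g = grads[i](w) - grads[i](x) + mu
            w = w - lr * g
        y_prev, y = y, w
        # the one additional momentum line: heavy-ball style extrapolation
        x = y + tau * (y - y_prev)
    return y
```

As a toy sum-of-nonconvex instance, take f_i(x) = (1 + c_i) x^2 / 2 - b x in one dimension with the c_i summing to zero and some 1 + c_i < 0: each component is nonconvex, yet their average f(x) = x^2 / 2 - b x is strongly convex with minimizer x = b.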