Complexities in Projection-Free Stochastic Nonconvex Minimization
Proceedings of Machine Learning Research, PMLR 89:2868-2876, 2019.
Abstract
For constrained nonconvex minimization problems, we propose a meta stochastic projection-free optimization algorithm, named Normalized Frank-Wolfe Updating, that can take any Gradient Estimator (GE) as input. For this algorithm, we prove a convergence rate that holds regardless of the choice of GE. With a sophisticated GE, the algorithm can significantly improve the Stochastic First-order Oracle (SFO) complexity. Further, a new second-order GE strategy is proposed to incorporate curvature information, which enjoys a theoretical advantage over the first-order ones. In addition, this paper provides a lower bound on the number of Linear-optimization Oracle (LO) queries required to reach an approximate stationary point. Simulation studies validate our analysis under various parameter settings.
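The abstract describes a projection-free scheme: each iteration queries a plug-in gradient estimator and a Linear-optimization Oracle (LO), then moves along the segment toward the LO output, so the iterate stays feasible without any projection. The sketch below illustrates this generic stochastic Frank-Wolfe template on an l1-ball constraint; it is a minimal illustration under assumed choices (the `lmo_l1_ball` oracle, the classic 2/(t+2) step size), not the paper's exact normalized update.

```python
import numpy as np

def lmo_l1_ball(g, radius=1.0):
    """Linear-optimization oracle for the l1 ball:
    argmin_{||v||_1 <= radius} <g, v> is a signed vertex."""
    i = np.argmax(np.abs(g))
    v = np.zeros_like(g)
    v[i] = -radius * np.sign(g[i])
    return v

def stochastic_frank_wolfe(grad_estimator, x0, n_iters=100, radius=1.0):
    """Projection-free minimization: each step calls a plug-in gradient
    estimator (any GE) and the LO, then takes a convex combination of the
    current iterate and the LO output, which preserves feasibility."""
    x = x0.copy()
    for t in range(n_iters):
        g = grad_estimator(x, t)         # any stochastic gradient estimate
        v = lmo_l1_ball(g, radius)       # one LO query; no projection
        gamma = 2.0 / (t + 2.0)          # classic Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * v  # stays inside the l1 ball
    return x

# Example: minimize ||x - b||^2 over the unit l1 ball with noisy gradients.
rng = np.random.default_rng(0)
b = np.array([0.5, -0.3, 0.2])
noisy_grad = lambda x, t: 2.0 * (x - b) + 0.01 * rng.normal(size=x.shape)
x_final = stochastic_frank_wolfe(noisy_grad, np.zeros(3), n_iters=200)
```

The meta-algorithm structure is visible here: `grad_estimator` is an arbitrary callable, so variance-reduced or second-order estimators can be swapped in without changing the update rule.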