Complexities in Projection-Free Stochastic Non-convex Minimization

Zebang Shen, Cong Fang, Peilin Zhao, Junzhou Huang, Hui Qian ;
Proceedings of Machine Learning Research, PMLR 89:2868-2876, 2019.

Abstract

For constrained nonconvex minimization problems, we propose a meta stochastic projection-free optimization algorithm, named Normalized Frank Wolfe Updating, that can take any Gradient Estimator (GE) as input. We prove a convergence rate for this algorithm regardless of the choice of GE. With a sophisticated GE, the algorithm significantly improves the Stochastic First-order Oracle (SFO) complexity. Further, a new second-order GE strategy is proposed to incorporate curvature information, which enjoys a theoretical advantage over first-order strategies. In addition, this paper provides a lower bound on the number of Linear-optimization Oracle (LO) queries required to achieve an approximate stationary point. Simulation studies validate our analysis under various parameter settings.
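The abstract's core template — a projection-free loop that accepts any gradient estimator and touches the constraint set only through a linear-optimization oracle — can be illustrated with a minimal sketch. This is a generic stochastic Frank-Wolfe iteration, not the paper's exact Normalized Frank Wolfe Updating; the `grad_estimator` argument, the l1-ball constraint, and the classical `2/(t+2)` step size are illustrative assumptions.

```python
import numpy as np

def lo_l1_ball(grad, radius=1.0):
    # Linear-optimization Oracle (LO) over the l1-ball of the given radius:
    # argmin_{||s||_1 <= radius} <grad, s> is a signed, scaled basis vector
    # at the coordinate of largest gradient magnitude.
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def stochastic_frank_wolfe(grad_estimator, x0, steps=100, radius=1.0):
    # Meta projection-free loop: any gradient estimator (GE) can be plugged
    # in, and feasibility is maintained without projections because each
    # iterate is a convex combination of feasible points.
    x = x0.copy()
    for t in range(steps):
        g = grad_estimator(x)            # stochastic gradient estimate (GE)
        s = lo_l1_ball(g, radius)        # one LO query per iteration
        gamma = 2.0 / (t + 2)            # classical Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x
```

A usage example with a least-squares objective (here the "estimator" is the exact gradient; a minibatch version would subsample rows of `A`):

```python
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
ge = lambda x: 2 * A.T @ (A @ x - b)
x = stochastic_frank_wolfe(ge, np.zeros(5), steps=200)
```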