Katalyst: Boosting Convex Katayusha for Non-Convex Problems with a Large Condition Number
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1102-1111, 2019.
Abstract
An important class of non-convex objectives with wide applications in machine learning consists of a sum of $n$ smooth functions plus a non-smooth convex function. Many studies have been devoted to tackling these problems with one of two types of variance-reduction techniques: SVRG-type, which computes a full gradient occasionally, and SAGA-type, which maintains $n$ stochastic gradients at every iteration. In practice, SVRG-type is preferred to SAGA-type due to its potentially lower memory cost. An interesting question that has been largely ignored is how to improve the complexity of variance-reduction methods for problems with a large condition number, which measures the degree to which the objective is close to a convex function. In this paper, we present a simple but non-trivial boosting of a state-of-the-art SVRG-type method for convex problems (namely, Katyusha) so that it enjoys an improved complexity for solving non-convex problems with a large condition number (i.e., problems that are close to convex). To the best of our knowledge, its complexity has the best dependence on $n$ and the degree of non-convexity, and it also matches that of a recent SAGA-type accelerated stochastic algorithm for constrained non-convex smooth optimization.
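To make the SVRG-type construction in the abstract concrete, the following is a minimal, illustrative sketch of a variance-reduced gradient estimator of that type on a simple least-squares instance. It is not the paper's Katalyst method and does not include the Katyusha acceleration; the problem, data, and step-size choices are hypothetical choices for the demo. It shows the defining feature contrasted above: a full gradient is computed only occasionally at a snapshot point, rather than maintaining $n$ stored stochastic gradients as SAGA-type methods do.

```python
# Illustrative SVRG-type variance-reduced estimator (NOT the paper's Katalyst
# algorithm): a full gradient is computed occasionally at a snapshot, instead
# of storing n stochastic gradients (SAGA-type). Data and hyperparameters
# below are hypothetical choices for this demo.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i . x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    """Gradient of the average objective (1/n) * sum_i f_i(x)."""
    return A.T @ (A @ x - b) / n

def svrg(x0, step=0.02, outer=30, inner=5 * n):
    x = x0.copy()
    for _ in range(outer):
        snapshot = x.copy()
        mu = full_grad(snapshot)  # the occasional full gradient
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced estimator: unbiased for full_grad(x), and its
            # variance vanishes as x and snapshot approach the optimum.
            g = grad_i(x, i) - grad_i(snapshot, i) + mu
            x -= step * g
    return x

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
x_hat = svrg(np.zeros(d))
print(np.linalg.norm(x_hat - x_star))  # should be small
```

The memory contrast in the abstract is visible here: this loop stores only the current iterate, the snapshot, and one averaged gradient, whereas a SAGA-type method would keep a table of $n$ component gradients.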