Model Function Based Conditional Gradient Method with Armijo-like Line Search
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4891-4900, 2019.
Abstract
The Conditional Gradient Method is generalized to a class of nonsmooth nonconvex optimization problems with many applications in machine learning. The proposed algorithm iterates by minimizing so-called model functions over the constraint set. Complemented with an Armijo line search procedure, we prove that subsequences converge to a stationary point. The abstract framework of model functions provides great flexibility in the design of concrete algorithms. As special cases, for example, we develop an algorithm for additive composite problems and an algorithm for nonlinear composite problems, which leads to a Gauss-Newton-type algorithm. Both instances are novel in nonsmooth nonconvex optimization and come with numerous applications in machine learning. We perform an experiment on a nonlinear robust regression problem and discuss the flexibility of the proposed framework in several matrix factorization formulations.
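To make the iteration scheme concrete, the following is a minimal sketch of the classical conditional gradient (Frank-Wolfe) method with an Armijo-like backtracking line search, in the smooth case where the paper's model function reduces to the linearization f(x) + ⟨∇f(x), s − x⟩. The function names (`conditional_gradient_armijo`, `lmo`) and all parameter values are illustrative assumptions, not the paper's implementation; the paper's framework covers far more general nonsmooth model functions.

```python
import numpy as np

def conditional_gradient_armijo(f, grad_f, lmo, x0, max_iter=100,
                                gamma=0.5, delta=1e-4, tol=1e-8):
    """Sketch of conditional gradient with Armijo-like backtracking.

    lmo(g) is a linear minimization oracle returning argmin_{s in C} <g, s>.
    In the smooth case, minimizing the linear model over C is exactly
    this oracle call; the paper generalizes the model beyond linearization.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad_f(x)
        s = lmo(g)            # minimize the (linear) model over the constraint set
        d = s - x             # feasible direction
        gap = -g @ d          # Frank-Wolfe gap: a stationarity measure
        if gap < tol:
            break
        t = 1.0               # Armijo-like backtracking on the step size
        while f(x + t * d) > f(x) - delta * t * gap:
            t *= gamma
        x = x + t * d
    return x

# Toy usage: minimize ||x - b||^2 over the Euclidean unit ball.
b = np.array([2.0, 0.0])
f = lambda x: float(np.sum((x - b) ** 2))
grad_f = lambda x: 2.0 * (x - b)
lmo = lambda g: -g / max(np.linalg.norm(g), 1e-12)  # argmin of <g, s> over the ball
x_star = conditional_gradient_armijo(f, grad_f, lmo, np.zeros(2))
# x_star approaches the projection of b onto the ball, i.e. [1, 0]
```

The linear minimization oracle is what makes the method attractive for constraint sets (balls, simplices, spectrahedra) where linear subproblems are cheap but projections are not.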