Model Function Based Conditional Gradient Method with Armijo-like Line Search

Peter Ochs, Yura Malitsky
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4891-4900, 2019.

Abstract

The Conditional Gradient Method is generalized to a class of non-smooth non-convex optimization problems with many applications in machine learning. The proposed algorithm iterates by minimizing so-called model functions over the constraint set. Complemented with an Armijo line search procedure, we prove that subsequences converge to a stationary point. The abstract framework of model functions provides great flexibility in the design of concrete algorithms. As special cases, for example, we develop an algorithm for additive composite problems and an algorithm for non-linear composite problems which leads to a Gauss-Newton-type algorithm. Both instances are novel in non-smooth non-convex optimization and come with numerous applications in machine learning. We perform an experiment on a non-linear robust regression problem and discuss the flexibility of the proposed framework in several matrix factorization formulations.
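To make the iteration concrete, the following is a minimal Python sketch of one possible instantiation for the additive composite case mentioned in the abstract: f = g + h with smooth g, convex non-smooth h, and a box constraint. Each step minimizes a model function (linearized g plus h) over the constraint set and then backtracks with an Armijo-like test on the model decrease. The problem data, parameter names (rho, delta), and helper functions below are illustrative assumptions, not the authors' implementation.

# Illustrative sketch of a model-function-based conditional gradient step
# with an Armijo-like backtracking line search (assumed names and data;
# not the paper's reference implementation).
import numpy as np

# Additive composite example: f(x) = g(x) + h(x) over the box C = [-1, 1]^n,
# with g(x) = 0.5 * ||A x - b||^2 (smooth) and h(x) = lam * ||x||_1 (convex, non-smooth).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.1

def g(x):      return 0.5 * np.sum((A @ x - b) ** 2)
def grad_g(x): return A.T @ (A @ x - b)
def h(x):      return lam * np.sum(np.abs(x))
def f(x):      return g(x) + h(x)

def model_minimizer(x):
    """Minimize the model f_x(y) = g(x) + <grad g(x), y - x> + h(y) over the box.

    The subproblem is separable: per coordinate, minimize c*y + lam*|y| on [-1, 1],
    whose solution is -1 if c > lam, +1 if c < -lam, and 0 otherwise.
    """
    c = grad_g(x)
    y = np.zeros_like(x)
    y[c > lam] = -1.0
    y[c < -lam] = 1.0
    return y

def model_decrease(x, y):
    """Model decrease Δ = f_x(y) - f_x(x) = <grad g(x), y - x> + h(y) - h(x) <= 0."""
    return grad_g(x) @ (y - x) + h(y) - h(x)

def armijo_step(x, y, rho=0.5, delta=0.5, max_backtracks=50):
    """Accept the largest gamma in {1, delta, delta^2, ...} with
    f(x + gamma*(y - x)) <= f(x) + rho * gamma * Δ."""
    d = y - x
    dec = model_decrease(x, y)
    fx = f(x)
    gamma = 1.0
    for _ in range(max_backtracks):
        if f(x + gamma * d) <= fx + rho * gamma * dec:
            return x + gamma * d
        gamma *= delta
    return x  # no sufficient decrease found (x is essentially stationary)

# Run a few iterations from a feasible starting point.
x = np.zeros(A.shape[1])
for k in range(100):
    y = model_minimizer(x)
    if abs(model_decrease(x, y)) < 1e-8:  # model decrease as stationarity measure
        break
    x = armijo_step(x, y)
print("final objective:", f(x))

In this sketch the model decrease also serves as the stationarity measure that stops the loop, since the iterate stays feasible for any step size gamma in (0, 1].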

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-ochs19a,
  title     = {Model Function Based Conditional Gradient Method with Armijo-like Line Search},
  author    = {Ochs, Peter and Malitsky, Yura},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4891--4900},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/ochs19a/ochs19a.pdf},
  url       = {https://proceedings.mlr.press/v97/ochs19a.html},
  abstract  = {The Conditional Gradient Method is generalized to a class of non-smooth non-convex optimization problems with many applications in machine learning. The proposed algorithm iterates by minimizing so-called model functions over the constraint set. Complemented with an Armijo line search procedure, we prove that subsequences converge to a stationary point. The abstract framework of model functions provides great flexibility in the design of concrete algorithms. As special cases, for example, we develop an algorithm for additive composite problems and an algorithm for non-linear composite problems which leads to a Gauss-Newton-type algorithm. Both instances are novel in non-smooth non-convex optimization and come with numerous applications in machine learning. We perform an experiment on a non-linear robust regression problem and discuss the flexibility of the proposed framework in several matrix factorization formulations.}
}
Endnote
%0 Conference Paper
%T Model Function Based Conditional Gradient Method with Armijo-like Line Search
%A Peter Ochs
%A Yura Malitsky
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-ochs19a
%I PMLR
%P 4891--4900
%U https://proceedings.mlr.press/v97/ochs19a.html
%V 97
%X The Conditional Gradient Method is generalized to a class of non-smooth non-convex optimization problems with many applications in machine learning. The proposed algorithm iterates by minimizing so-called model functions over the constraint set. Complemented with an Armijo line search procedure, we prove that subsequences converge to a stationary point. The abstract framework of model functions provides great flexibility in the design of concrete algorithms. As special cases, for example, we develop an algorithm for additive composite problems and an algorithm for non-linear composite problems which leads to a Gauss-Newton-type algorithm. Both instances are novel in non-smooth non-convex optimization and come with numerous applications in machine learning. We perform an experiment on a non-linear robust regression problem and discuss the flexibility of the proposed framework in several matrix factorization formulations.
APA
Ochs, P. & Malitsky, Y. (2019). Model Function Based Conditional Gradient Method with Armijo-like Line Search. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4891-4900. Available from https://proceedings.mlr.press/v97/ochs19a.html.
