Self-Concordant Analysis of Frank-Wolfe Algorithms

Pavel Dvurechensky, Petr Ostroukhov, Kamil Safin, Shimrit Shtern, Mathias Staudigl
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2814-2824, 2020.

Abstract

Projection-free optimization via variants of the Frank-Wolfe (FW) method, a.k.a. the Conditional Gradient method, has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than a projection and the sparsity of the iterates needs to be preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, so the existing theoretical guarantees for FW methods do not apply. We use the theory of SC functions to derive a new adaptive step size for FW methods and prove a global convergence rate of O(1/k) after k iterations. If the problem admits a stronger local linear minimization oracle, we construct a novel FW method with a linear convergence rate for SC functions.
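To make the adaptive step size concrete, here is a minimal sketch (not the authors' code) of one way such a FW iteration can look for a standard self-concordant objective. It assumes the SC parameter M = 2, in which case the SC upper model f(x) - t*G + omega_*(t*e), with FW gap G, local norm e = ||s - x|| in the Hessian metric, and omega_*(u) = -u - ln(1 - u), is minimized in closed form at t = G/(e(G + e)). The log-optimal-portfolio objective and all variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical example: f(x) = -sum(log(R @ x)), a standard self-concordant
# function (log-loss over positive "returns"), minimized over the unit simplex.
rng = np.random.default_rng(0)
n, p = 200, 30
R = rng.uniform(0.5, 1.5, size=(n, p))  # positive entries keep R @ x > 0

def f(x):    return -np.sum(np.log(R @ x))
def grad(x): return -R.T @ (1.0 / (R @ x))

x = np.full(p, 1.0 / p)  # strictly feasible starting point
for k in range(500):
    g = grad(x)
    s = np.zeros(p); s[np.argmin(g)] = 1.0  # LMO over the simplex: best vertex
    d = s - x
    gap = -g @ d  # Frank-Wolfe gap <grad f(x), x - s> >= 0
    if gap <= 1e-8:
        break
    # Local norm e = ||d||_{H(x)} without forming the Hessian:
    # H(x) = R^T diag(1/(Rx)^2) R, so ||d||_H = ||(R d) / (R x)||_2.
    e = np.linalg.norm((R @ d) / (R @ x))
    # Adaptive SC step: minimizes f(x) - t*gap + omega_*(t*e),
    # omega_*(u) = -u - log(1 - u); closed form t = gap / (e * (gap + e)).
    t = min(1.0, gap / (e * (gap + e)))
    x = x + t * d
print(f"f(x) = {f(x):.6f}, FW gap = {gap:.2e} after {k} iterations")
```

Note that t * e = gap / (gap + e) < 1, so the trial point stays inside the Dikin ellipsoid, and hence inside the domain of f; this is exactly the property the SC analysis exploits to avoid the unbounded-curvature problem.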

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-dvurechensky20a,
  title     = {Self-Concordant Analysis of Frank-{W}olfe Algorithms},
  author    = {Dvurechensky, Pavel and Ostroukhov, Petr and Safin, Kamil and Shtern, Shimrit and Staudigl, Mathias},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2814--2824},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/dvurechensky20a/dvurechensky20a.pdf},
  url       = {https://proceedings.mlr.press/v119/dvurechensky20a.html}
}
Endnote
%0 Conference Paper
%T Self-Concordant Analysis of Frank-Wolfe Algorithms
%A Pavel Dvurechensky
%A Petr Ostroukhov
%A Kamil Safin
%A Shimrit Shtern
%A Mathias Staudigl
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-dvurechensky20a
%I PMLR
%P 2814--2824
%U https://proceedings.mlr.press/v119/dvurechensky20a.html
%V 119
APA
Dvurechensky, P., Ostroukhov, P., Safin, K., Shtern, S., & Staudigl, M. (2020). Self-Concordant Analysis of Frank-Wolfe Algorithms. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2814-2824. Available from https://proceedings.mlr.press/v119/dvurechensky20a.html.
