Inexact Tensor Methods with Dynamic Accuracies

Nikita Doikov, Yurii Nesterov
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2577-2586, 2020.

Abstract

In this paper, we study inexact high-order Tensor Methods for solving convex optimization problems with a composite objective. At every step of such methods, we use an approximate solution of the auxiliary problem, defined by a bound on the residual in function value. We propose two dynamic strategies for choosing the inner accuracy: the first decreases as $1/k^{p + 1}$, where $p \geq 1$ is the order of the method and $k$ is the iteration counter, and the second uses the last progress in the target objective as the inner accuracy. We show that inexact Tensor Methods with these strategies achieve the same global convergence rate as in the error-free case. For the second approach, we also establish local superlinear rates (for $p \geq 2$) and propose an accelerated scheme. Lastly, we present computational results on a variety of machine learning problems for several methods and different accuracy policies.
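The two inner-accuracy policies described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the constant `c`, the function names, and the surrounding method are assumptions added here.

```python
def accuracy_polynomial(k: int, p: int, c: float = 1.0) -> float:
    """First policy: inner accuracy decreasing as 1/k^(p+1),
    where p >= 1 is the order of the method and k is the iteration counter.
    The scaling constant c is a hypothetical parameter."""
    return c / k ** (p + 1)

def accuracy_last_progress(f_prev: float, f_curr: float) -> float:
    """Second policy: use the last progress in the target objective,
    i.e. the decrease F(x_{k-1}) - F(x_k), as the next inner accuracy."""
    return f_prev - f_curr

# Example: a third-order method (p = 3) at iteration k = 2
delta_1 = accuracy_polynomial(2, 3)          # 1 / 2**4 = 0.0625
# Example: objective dropped from 1.0 to 0.75 in the last step
delta_2 = accuracy_last_progress(1.0, 0.75)  # 0.25
```

Either value would then serve as the tolerance to which the method's auxiliary subproblem is solved at the next step.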

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-doikov20a,
  title     = {Inexact Tensor Methods with Dynamic Accuracies},
  author    = {Doikov, Nikita and Nesterov, Yurii},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2577--2586},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/doikov20a/doikov20a.pdf},
  url       = {https://proceedings.mlr.press/v119/doikov20a.html},
  abstract  = {In this paper, we study inexact high-order Tensor Methods for solving convex optimization problems with composite objective. At every step of such methods, we use approximate solution of the auxiliary problem, defined by the bound for the residual in function value. We propose two dynamic strategies for choosing the inner accuracy: the first one is decreasing as $1/k^{p + 1}$, where $p \geq 1$ is the order of the method and $k$ is the iteration counter, and the second approach is using for the inner accuracy the last progress in the target objective. We show that inexact Tensor Methods with these strategies achieve the same global convergence rate as in the error-free case. For the second approach we also establish local superlinear rates (for $p \geq 2$), and propose the accelerated scheme. Lastly, we present computational results on a variety of machine learning problems for several methods and different accuracy policies.}
}
Endnote
%0 Conference Paper
%T Inexact Tensor Methods with Dynamic Accuracies
%A Nikita Doikov
%A Yurii Nesterov
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-doikov20a
%I PMLR
%P 2577--2586
%U https://proceedings.mlr.press/v119/doikov20a.html
%V 119
%X In this paper, we study inexact high-order Tensor Methods for solving convex optimization problems with composite objective. At every step of such methods, we use approximate solution of the auxiliary problem, defined by the bound for the residual in function value. We propose two dynamic strategies for choosing the inner accuracy: the first one is decreasing as $1/k^{p + 1}$, where $p \geq 1$ is the order of the method and $k$ is the iteration counter, and the second approach is using for the inner accuracy the last progress in the target objective. We show that inexact Tensor Methods with these strategies achieve the same global convergence rate as in the error-free case. For the second approach we also establish local superlinear rates (for $p \geq 2$), and propose the accelerated scheme. Lastly, we present computational results on a variety of machine learning problems for several methods and different accuracy policies.
APA
Doikov, N. & Nesterov, Y. (2020). Inexact Tensor Methods with Dynamic Accuracies. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2577-2586. Available from https://proceedings.mlr.press/v119/doikov20a.html.
