An Accelerated DFO Algorithm for Finite-sum Convex Functions

Yuwen Chen, Antonio Orvieto, Aurelien Lucchi
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1681-1690, 2020.

Abstract

Derivative-free optimization (DFO) has recently gained a lot of momentum in machine learning, spawning interest in the community to design faster methods for problems where gradients are not accessible. While some attention has been given to the concept of acceleration in the DFO literature, existing stochastic algorithms for objective functions with a finite-sum structure have not been shown theoretically to achieve an accelerated rate of convergence. Algorithms that use acceleration in such a setting are prone to instabilities, making it difficult to reach convergence. In this work, we exploit the finite-sum structure of the objective in order to design a variance-reduced DFO algorithm that provably yields acceleration. We prove rates of convergence for both smooth convex and strongly-convex finite-sum objective functions. Finally, we validate our theoretical results empirically on several tasks and datasets.
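The paper's algorithm itself is not reproduced on this page; as a rough illustration of the ingredients the abstract names (gradient estimates built from function evaluations only, variance reduction over the finite sum, and a momentum step), the sketch below combines a forward finite-difference estimator with an SVRG-style correction and heavy-ball momentum. The function names (`fd_grad`, `accelerated_vr_dfo`), the use of heavy-ball momentum in place of a Nesterov-type scheme, and all hyperparameters are illustrative assumptions, not the authors' method.

```python
# Minimal sketch (assumptions, not the paper's algorithm): derivative-free
# variance-reduced updates with momentum on f(x) = (1/n) * sum_i f_i(x).
# Component gradients are never evaluated; they are estimated from
# function values via forward finite differences (d+1 evaluations each).
import numpy as np

def fd_grad(f_i, x, h=1e-5):
    """Forward finite-difference estimate of grad f_i at x."""
    d = x.shape[0]
    fx = f_i(x)
    g = np.zeros(d)
    for j in range(d):
        e = np.zeros(d)
        e[j] = h
        g[j] = (f_i(x + e) - fx) / h
    return g

def accelerated_vr_dfo(fs, x0, steps=100, lr=0.01, momentum=0.5,
                       snapshot_every=20, seed=None):
    """SVRG-style variance reduction + heavy-ball momentum, with every
    gradient replaced by a finite-difference estimate. Hyperparameters
    are placeholders, not tuned or theoretically justified values."""
    rng = np.random.default_rng(seed)
    n = len(fs)
    x = x0.copy()
    v = np.zeros_like(x0)                      # momentum buffer
    snapshot = x0.copy()
    full_grad = np.mean([fd_grad(f, snapshot) for f in fs], axis=0)
    for t in range(steps):
        if t > 0 and t % snapshot_every == 0:
            # refresh the snapshot and its full (estimated) gradient
            snapshot = x.copy()
            full_grad = np.mean([fd_grad(f, snapshot) for f in fs], axis=0)
        i = rng.integers(n)
        # variance-reduced gradient estimate built from function values only
        g = fd_grad(fs[i], x) - fd_grad(fs[i], snapshot) + full_grad
        v = momentum * v - lr * g
        x = x + v
    return x

# Toy usage: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)^2
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(50, 10)), rng.normal(size=50)
    fs = [lambda x, a=A[i], y=b[i]: 0.5 * (a @ x - y) ** 2 for i in range(50)]
    x_out = accelerated_vr_dfo(fs, np.zeros(10), steps=300, seed=1)
    print("final objective:", np.mean([f(x_out) for f in fs]))
```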

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-chen20r,
  title     = {An Accelerated {DFO} Algorithm for Finite-sum Convex Functions},
  author    = {Chen, Yuwen and Orvieto, Antonio and Lucchi, Aurelien},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1681--1690},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/chen20r/chen20r.pdf},
  url       = {https://proceedings.mlr.press/v119/chen20r.html},
  abstract  = {Derivative-free optimization (DFO) has recently gained a lot of momentum in machine learning, spawning interest in the community to design faster methods for problems where gradients are not accessible. While some attention has been given to the concept of acceleration in the DFO literature, existing stochastic algorithms for objective functions with a finite-sum structure have not been shown theoretically to achieve an accelerated rate of convergence. Algorithms that use acceleration in such a setting are prone to instabilities, making it difficult to reach convergence. In this work, we exploit the finite-sum structure of the objective in order to design a variance-reduced DFO algorithm that provably yields acceleration. We prove rates of convergence for both smooth convex and strongly-convex finite-sum objective functions. Finally, we validate our theoretical results empirically on several tasks and datasets.}
}
Endnote
%0 Conference Paper
%T An Accelerated DFO Algorithm for Finite-sum Convex Functions
%A Yuwen Chen
%A Antonio Orvieto
%A Aurelien Lucchi
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-chen20r
%I PMLR
%P 1681--1690
%U https://proceedings.mlr.press/v119/chen20r.html
%V 119
%X Derivative-free optimization (DFO) has recently gained a lot of momentum in machine learning, spawning interest in the community to design faster methods for problems where gradients are not accessible. While some attention has been given to the concept of acceleration in the DFO literature, existing stochastic algorithms for objective functions with a finite-sum structure have not been shown theoretically to achieve an accelerated rate of convergence. Algorithms that use acceleration in such a setting are prone to instabilities, making it difficult to reach convergence. In this work, we exploit the finite-sum structure of the objective in order to design a variance-reduced DFO algorithm that provably yields acceleration. We prove rates of convergence for both smooth convex and strongly-convex finite-sum objective functions. Finally, we validate our theoretical results empirically on several tasks and datasets.
APA
Chen, Y., Orvieto, A. & Lucchi, A. (2020). An Accelerated DFO Algorithm for Finite-sum Convex Functions. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1681-1690. Available from https://proceedings.mlr.press/v119/chen20r.html.