Towards Memory-Friendly Deterministic Incremental Gradient Method

Jiahao Xie, Hui Qian, Zebang Shen, Chao Zhang
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1147-1156, 2018.

Abstract

Incremental Gradient (IG) methods are classical strategies for solving finite-sum minimization problems. Deterministic IG methods are particularly favorable for handling massive-scale problems due to their memory-friendly data access patterns. In this paper, we propose a new deterministic variant of the IG method SVRG that blends a periodically updated full gradient with a component function gradient selected in a cyclic order. Our method uses only $O(1)$ extra gradient storage without compromising linear convergence. Empirical results demonstrate that the proposed method is advantageous over existing incremental gradient algorithms, especially on problems that do not fit into physical memory.
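The abstract describes the update only at a high level. Below is a minimal sketch, assuming a standard SVRG-style variance-reduced step in which the component index is visited in a fixed cyclic order rather than sampled at random; the names (cyclic_svrg, grad_i, full_grad) and parameters are illustrative placeholders, not the paper's notation, and the exact schedule and step-size rule in the paper may differ.

import numpy as np

def cyclic_svrg(grad_i, full_grad, x0, n, step_size, num_outer):
    """SVRG-style update with deterministic cyclic component selection (sketch).

    Extra storage is O(1) gradients: the snapshot point and the full
    gradient computed at the snapshot.
    """
    x_snapshot = np.array(x0, dtype=float)
    for _ in range(num_outer):
        # Periodically updated full gradient at the snapshot point.
        mu = full_grad(x_snapshot)
        x = x_snapshot.copy()
        # Visit the component functions in a fixed cyclic order.
        for i in range(n):
            # Blend the current component gradient with the stale full
            # gradient to form a variance-reduced direction.
            v = grad_i(x, i) - grad_i(x_snapshot, i) + mu
            x -= step_size * v
        x_snapshot = x
    return x_snapshot

# Toy usage on least squares f(x) = (1/n) * sum_i (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
g_i = lambda x, i: 2.0 * (A[i] @ x - b[i]) * A[i]
g_full = lambda x: 2.0 * A.T @ (A @ x - b) / len(b)
x_hat = cyclic_svrg(g_i, g_full, np.zeros(5), len(b), step_size=1e-2, num_outer=50)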

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-xie18a,
  title     = {Towards Memory-Friendly Deterministic Incremental Gradient Method},
  author    = {Xie, Jiahao and Qian, Hui and Shen, Zebang and Zhang, Chao},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1147--1156},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/xie18a/xie18a.pdf},
  url       = {https://proceedings.mlr.press/v84/xie18a.html},
  abstract  = {Incremental Gradient (IG) methods are classical strategies in solving finite sum minimization problems. Deterministic IG methods are particularly favorable in handling massive scale problem due to its memory-friendly data access pattern. In this paper, we propose a new deterministic variant of the IG method SVRG that blends a periodically updated full gradient with a component function gradient selected in a cyclic order. Our method uses only $O(1)$ extra gradient storage without compromising the linear convergence. Empirical results demonstrate that the proposed method is advantageous over existing incremental gradient algorithms, especially on problems that does not fit into physical memory.}
}
Endnote
%0 Conference Paper
%T Towards Memory-Friendly Deterministic Incremental Gradient Method
%A Jiahao Xie
%A Hui Qian
%A Zebang Shen
%A Chao Zhang
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-xie18a
%I PMLR
%P 1147--1156
%U https://proceedings.mlr.press/v84/xie18a.html
%V 84
%X Incremental Gradient (IG) methods are classical strategies in solving finite sum minimization problems. Deterministic IG methods are particularly favorable in handling massive scale problem due to its memory-friendly data access pattern. In this paper, we propose a new deterministic variant of the IG method SVRG that blends a periodically updated full gradient with a component function gradient selected in a cyclic order. Our method uses only $O(1)$ extra gradient storage without compromising the linear convergence. Empirical results demonstrate that the proposed method is advantageous over existing incremental gradient algorithms, especially on problems that does not fit into physical memory.
APA
Xie, J., Qian, H., Shen, Z. & Zhang, C. (2018). Towards Memory-Friendly Deterministic Incremental Gradient Method. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1147-1156. Available from https://proceedings.mlr.press/v84/xie18a.html.
