Towards Memory-Friendly Deterministic Incremental Gradient Method

Jiahao Xie, Hui Qian, Zebang Shen, Chao Zhang;
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1147-1156, 2018.

Abstract

Incremental Gradient (IG) methods are classical strategies for solving finite-sum minimization problems. Deterministic IG methods are particularly favorable for massive-scale problems due to their memory-friendly data access pattern. In this paper, we propose a new deterministic variant of SVRG that blends a periodically updated full gradient with component function gradients selected in a cyclic order. Our method uses only $O(1)$ extra gradient storage without compromising linear convergence. Empirical results demonstrate that the proposed method is advantageous over existing incremental gradient algorithms, especially on problems that do not fit into physical memory.
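The abstract describes an SVRG-style update in which the component gradient is chosen cyclically rather than sampled at random. A minimal sketch of such a deterministic, cyclic variance-reduced loop is below; the function name `cyclic_svrg` and all parameter choices are hypothetical illustrations, not the authors' exact algorithm.

```python
import numpy as np

def cyclic_svrg(grad_i, x0, n, lr=0.05, epochs=20):
    """Hypothetical sketch of a deterministic SVRG-style method.

    A full gradient, refreshed once per epoch, is blended with
    component gradients visited in a fixed cyclic order, so only
    O(1) extra gradient storage (the snapshot and the full gradient)
    is needed beyond the iterate itself.
    """
    x = x0.copy()
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot, computed in one pass over the data.
        full_grad = sum(grad_i(i, snapshot) for i in range(n)) / n
        for i in range(n):  # deterministic cyclic order, not random sampling
            # SVRG-style variance-reduced direction with cyclic index i.
            g = grad_i(i, x) - grad_i(i, snapshot) + full_grad
            x -= lr * g
    return x
```

For example, with component gradients `grad_i(i, x) = A[i] * (A[i] @ x - b[i])` of a least-squares objective, the loop converges linearly to the solution of `A x = b`.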
