NYTRO: When Subsampling Meets Early Stopping


Raffaello Camoriano, Tomás Angles, Alessandro Rudi, Lorenzo Rosasco;
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1403-1411, 2016.

Abstract

Early stopping is a well-known approach to reducing the time complexity of training and model selection for large-scale learning machines. On the other hand, memory/space (rather than time) complexity is the main constraint in many applications, and randomized subsampling techniques have been proposed to tackle this issue. In this paper we ask whether early stopping and subsampling ideas can be combined in a fruitful way. We consider the question in a least squares regression setting and propose a form of randomized iterative regularization based on early stopping and subsampling. In this context, we analyze the statistical and computational properties of the proposed method. Theoretical results are complemented and validated by a thorough experimental analysis.
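The idea described in the abstract can be illustrated with a minimal sketch: subsample a small set of Nyström centers to cap the memory footprint, then run plain gradient descent on the resulting least squares problem, using the iteration count as the regularization parameter. This is only an illustrative reconstruction, not the authors' implementation; the function names (`nytro_fit`, `nytro_predict`), the Gaussian kernel choice, and all parameter values are assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def nytro_fit(X, y, m=50, n_iter=100, sigma=1.0, seed=None):
    """Illustrative sketch: Nystrom subsampling + early-stopped gradient descent.

    Only m centers are kept, so the stored kernel block is n x m instead of
    n x n (the memory saving); regularization comes from stopping after
    n_iter iterations rather than from an explicit penalty (early stopping).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = X[rng.choice(n, size=m, replace=False)]
    Knm = gaussian_kernel(X, centers, sigma)          # n x m subsampled kernel
    # Step size 1/L, with L the Lipschitz constant of the gradient.
    lr = 1.0 / np.linalg.norm(Knm, ord=2) ** 2
    alpha = np.zeros(m)
    for _ in range(n_iter):                           # iterations act as the regularizer
        grad = Knm.T @ (Knm @ alpha - y)              # gradient of 0.5 * ||Knm a - y||^2
        alpha -= lr * grad
    return centers, alpha

def nytro_predict(X_test, centers, alpha, sigma=1.0):
    # Predictions are kernel combinations over the m centers only.
    return gaussian_kernel(X_test, centers, sigma) @ alpha
```

In practice the stopping iteration (here `n_iter`) would be chosen by monitoring validation error, exactly as one would tune an explicit regularization parameter.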
