Reducing training time by efficient localized kernel regression
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2603-2610, 2019.
We study the generalization properties of kernel regularized least squares regression based on a partitioning approach. We show that optimal rates of convergence are preserved if the number of local sets grows sufficiently slowly with the sample size. Moreover, the partitioning approach can be efficiently combined with local Nyström subsampling, reducing computational cost in two respects.
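The combination described in the abstract can be sketched in a few lines: partition the input space into cells, then fit an independent Nyström-approximated kernel ridge regressor on each cell's data. This is a minimal illustrative sketch, not the paper's exact estimator; the Gaussian kernel, the random-landmark Nyström scheme, the nearest-center partition, and all function names and parameters (`fit_local_nystrom`, `lam`, `gamma`, `m`, etc.) are assumptions chosen for illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between row sets A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_local_nystrom(X, y, lam=1e-2, gamma=1.0, m=20, rng=None):
    # Nystrom-approximated kernel ridge regression on one local cell:
    # choose m random landmarks Z, then solve the reduced m x m system
    #   (Knm^T Knm + lam * n * Kmm) alpha = Knm^T y.
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Z = X[idx]
    Knm = rbf_kernel(X, Z, gamma)   # n x m cross-kernel
    Kmm = rbf_kernel(Z, Z, gamma)   # m x m landmark kernel
    A = Knm.T @ Knm + lam * len(X) * Kmm + 1e-10 * np.eye(len(Z))
    alpha = np.linalg.solve(A, Knm.T @ y)
    return Z, alpha

def predict_local(Z, alpha, Xq, gamma=1.0):
    return rbf_kernel(Xq, Z, gamma) @ alpha

def fit_partitioned(X, y, centers, **kw):
    # Partitioning step: assign each point to its nearest center and
    # fit one local Nystrom estimator per cell.
    cell = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    return [fit_local_nystrom(X[cell == j], y[cell == j], **kw)
            for j in range(len(centers))]

def predict_partitioned(models, centers, Xq, gamma=1.0):
    # Route each query point to its cell's local estimator.
    cell = np.argmin(((Xq[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    out = np.empty(len(Xq))
    for j, (Z, alpha) in enumerate(models):
        mask = cell == j
        if mask.any():
            out[mask] = predict_local(Z, alpha, Xq[mask], gamma)
    return out
```

Each local solve costs only O(n_j m^2) instead of O(n^3) for a global exact solve, which is the twofold saving the abstract refers to: smaller per-cell sample sizes n_j from partitioning, and a reduced m x m linear system from Nyström subsampling.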