Reducing training time by efficient localized kernel regression

Nicole Mücke
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2603-2610, 2019.

Abstract

We study generalization properties of kernel regularized least squares regression based on a partitioning approach. We show that optimal rates of convergence are preserved if the number of local sets grows sufficiently slowly with the sample size. Moreover, the partitioning approach can be efficiently combined with local Nyström subsampling, reducing computational cost twofold.
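For intuition, the following is a minimal Python sketch of the kind of estimator the abstract describes: the data are split into m local cells, a Nyström-subsampled kernel ridge regression model is fit per cell, and each query is answered by the model of its cell. It is not the paper's exact estimator; the kernel width sigma, the regularization lam, the nearest-of-m-random-centers partition rule, and the landmark count n_landmarks are all illustrative assumptions.

import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_local(X, y, lam=1e-2, n_landmarks=20, rng=None):
    # Kernel ridge regression on one cell with Nystroem subsampling:
    # restrict the solution to the span of a random landmark subset Z and
    # solve (K_nm^T K_nm + lam * n * K_mm) alpha = K_nm^T y.
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(X)
    Z = X[rng.choice(n, size=min(n_landmarks, n), replace=False)]
    K_nm = rbf_kernel(X, Z)
    K_mm = rbf_kernel(Z, Z)
    A = K_nm.T @ K_nm + lam * n * K_mm + 1e-10 * np.eye(len(Z))
    alpha = np.linalg.solve(A, K_nm.T @ y)
    return Z, alpha

def fit_partitioned(X, y, m=4, **kwargs):
    # Partition by nearest of m centers drawn from the data (an illustrative
    # rule), then fit one local Nystroem-KRR model per cell.
    rng = np.random.default_rng(0)
    centers = X[rng.choice(len(X), size=m, replace=False)]
    cells = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    models = [fit_local(X[cells == j], y[cells == j], rng=rng, **kwargs)
              for j in range(m)]
    return centers, models

def predict_partitioned(centers, models, Xq):
    # Route each query point to the local model of its nearest-center cell.
    cells = np.argmin(((Xq[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
    yq = np.empty(len(Xq))
    for j, (Z, alpha) in enumerate(models):
        mask = cells == j
        if mask.any():
            yq[mask] = rbf_kernel(Xq[mask], Z) @ alpha
    return yq

# Toy usage: regress a noisy 1-D sine curve.
X = np.random.default_rng(1).uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(2).normal(size=500)
centers, models = fit_partitioned(X, y, m=4, lam=1e-3, n_landmarks=30)
print(predict_partitioned(centers, models, np.array([[0.0], [1.5]])))

Loosely, the two sources of savings the abstract alludes to are visible here: each local solve works on roughly n/m points instead of n, and Nyström subsampling shrinks each solve further from the cell size to the landmark count.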

Cite this Paper

BibTeX
@InProceedings{pmlr-v89-muecke19a,
  title     = {Reducing training time by efficient localized kernel regression},
  author    = {M\"{u}cke, Nicole},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2603--2610},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/muecke19a/muecke19a.pdf},
  url       = {https://proceedings.mlr.press/v89/muecke19a.html},
  abstract  = {We study generalization properties of kernel regularized least squares regression based on a partitioning approach. We show that optimal rates of convergence are preserved if the number of local sets grows sufficiently slowly with the sample size. Moreover, the partitioning approach can be efficiently combined with local Nyström subsampling, reducing computational cost twofold.}
}
Endnote
%0 Conference Paper
%T Reducing training time by efficient localized kernel regression
%A Nicole Mücke
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-muecke19a
%I PMLR
%P 2603--2610
%U https://proceedings.mlr.press/v89/muecke19a.html
%V 89
%X We study generalization properties of kernel regularized least squares regression based on a partitioning approach. We show that optimal rates of convergence are preserved if the number of local sets grows sufficiently slowly with the sample size. Moreover, the partitioning approach can be efficiently combined with local Nyström subsampling, reducing computational cost twofold.
APA
Mücke, N. (2019). Reducing training time by efficient localized kernel regression. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2603-2610. Available from https://proceedings.mlr.press/v89/muecke19a.html.
