Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over Hilbert Spaces

Junhong Lin, Volkan Cevher
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3102-3111, 2018.

Abstract

We investigate regularized algorithms, combined with projection, for the least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space. We prove convergence results with respect to a variety of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a result, we obtain optimal rates for regularized algorithms with randomized sketches, provided that the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates for sketched/Nyström regularized algorithms, in both the attainable and non-attainable cases.
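For the reproducing-kernel case, the kind of sketched-regularized estimator the abstract refers to can be illustrated by a minimal numpy sketch. This is not the authors' implementation: the Gaussian kernel, the Gaussian sketch matrix, and the specific regularization value below are illustrative assumptions. The estimator restricts the kernel ridge regression coefficients to the range of the sketch, i.e. it seeks coefficients of the form alpha = S^T beta with a random m x n sketch S.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sketched_krr(X, y, lam, m, gamma=1.0, rng=None):
    """Sketched kernel ridge regression with a Gaussian sketch S (m x n).

    Restricting alpha = S^T beta in the KRR objective
        min_alpha ||K alpha - y||^2 + n*lam * alpha^T K alpha
    gives the m x m normal equations
        (S K^2 S^T + n*lam * S K S^T) beta = S K y.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    K = gaussian_kernel(X, X, gamma)
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketch
    KSt = K @ S.T                                  # n x m
    A = KSt.T @ KSt + n * lam * (S @ KSt)          # m x m system
    # lstsq rather than solve: A can be numerically rank-deficient
    # when m exceeds the effective dimension of K.
    beta = np.linalg.lstsq(A, KSt.T @ y, rcond=None)[0]

    def predict(Xtest):
        # f(x) = sum_i alpha_i k(x_i, x), with alpha = S^T beta
        return gaussian_kernel(Xtest, X, gamma) @ (S.T @ beta)

    return predict
```

The point made in the abstract is that the sketch dimension m only needs to scale with the effective dimension of the kernel (up to a log factor) for this m x m problem to attain the same optimal rates as the full n x n one.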

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-lin18b,
  title     = {Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over {H}ilbert Spaces},
  author    = {Lin, Junhong and Cevher, Volkan},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3102--3111},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/lin18b/lin18b.pdf},
  url       = {https://proceedings.mlr.press/v80/lin18b.html},
  abstract  = {We investigate regularized algorithms combining with projection for least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space. We prove convergence results with respect to variants of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a result, we obtain optimal rates for regularized algorithms with randomized sketches, provided that the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates for sketched/Nyström regularized algorithms, considering both the attainable and non-attainable cases.}
}
Endnote
%0 Conference Paper
%T Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over Hilbert Spaces
%A Junhong Lin
%A Volkan Cevher
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-lin18b
%I PMLR
%P 3102--3111
%U https://proceedings.mlr.press/v80/lin18b.html
%V 80
%X We investigate regularized algorithms combining with projection for least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space. We prove convergence results with respect to variants of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a result, we obtain optimal rates for regularized algorithms with randomized sketches, provided that the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates for sketched/Nyström regularized algorithms, considering both the attainable and non-attainable cases.
APA
Lin, J. & Cevher, V. (2018). Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over Hilbert Spaces. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3102-3111. Available from https://proceedings.mlr.press/v80/lin18b.html.