Scaling up Kernel Ridge Regression via Locality Sensitive Hashing

Amir Zandieh, Navid Nouri, Ameya Velingker, Michael Kapralov, Ilya Razenshteyn
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4088-4097, 2020.

Abstract

Random binning features, introduced in the seminal paper of Rahimi and Recht ’07, are an efficient method for approximating a kernel matrix using locality sensitive hashing. Random binning features provide a very simple and efficient way to approximate the Laplace kernel but unfortunately do not apply to many important classes of kernels, notably ones that generate smooth Gaussian processes, such as the Gaussian kernel and the Matérn kernel. In this paper we introduce a simple weighted version of random binning features and show that the corresponding kernel function generates Gaussian processes of any desired smoothness. We show that our weighted random binning features provide a spectral approximation to the corresponding kernel matrix, leading to efficient algorithms for kernel ridge regression. Experiments on large-scale regression datasets show that our method outperforms the random Fourier features method in accuracy.
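As a quick illustration of the idea the abstract builds on (not code from this paper), here is a minimal NumPy sketch of the original Rahimi–Recht random binning estimator for the one-dimensional Laplace kernel. Two points collide in a randomly shifted grid of pitch delta with probability max(0, 1 − |x − y|/delta); drawing the pitch from a Gamma(2, sigma) distribution makes the average collision rate equal exp(−|x − y|/sigma). The function name and parameters are illustrative choices, not an API from the paper.

```python
import numpy as np

def random_binning_collision(x, y, sigma=1.0, P=20000, rng=None):
    """Estimate the 1-D Laplace kernel exp(-|x - y| / sigma) as the
    collision rate of P independent random binning hashes.

    Each repetition draws a grid pitch delta ~ Gamma(shape=2, scale=sigma)
    and a uniform shift u in [0, delta), then hashes a point to the bin
    index floor((point - u) / delta). For a fixed delta, x and y collide
    with probability max(0, 1 - |x - y| / delta); averaging over delta
    recovers the Laplace kernel.
    """
    rng = np.random.default_rng(rng)
    delta = rng.gamma(shape=2.0, scale=sigma, size=P)  # pitch distribution for Laplace
    u = rng.uniform(0.0, delta)                        # one random shift per grid
    hx = np.floor((x - u) / delta)                     # bin indices of x across all P grids
    hy = np.floor((y - u) / delta)                     # bin indices of y across all P grids
    return np.mean(hx == hy)                           # collision rate approximates the kernel
```

With enough repetitions the collision rate concentrates around the true kernel value; the paper's weighted variant modifies the distribution over grids so that the induced kernel generates smoother Gaussian processes.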

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-zandieh20a,
  title = {Scaling up Kernel Ridge Regression via Locality Sensitive Hashing},
  author = {Zandieh, Amir and Nouri, Navid and Velingker, Ameya and Kapralov, Michael and Razenshteyn, Ilya},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages = {4088--4097},
  year = {2020},
  editor = {Chiappa, Silvia and Calandra, Roberto},
  volume = {108},
  series = {Proceedings of Machine Learning Research},
  month = {26--28 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v108/zandieh20a/zandieh20a.pdf},
  url = {https://proceedings.mlr.press/v108/zandieh20a.html},
  abstract = {Random binning features, introduced in the seminal paper of Rahimi and Recht ’07, are an efficient method for approximating a kernel matrix using locality sensitive hashing. Random binning features provide a very simple and efficient way to approximate the Laplace kernel but unfortunately do not apply to many important classes of kernels, notably ones that generate smooth Gaussian processes, such as the Gaussian kernel and the Matérn kernel. In this paper we introduce a simple weighted version of random binning features and show that the corresponding kernel function generates Gaussian processes of any desired smoothness. We show that our weighted random binning features provide a spectral approximation to the corresponding kernel matrix, leading to efficient algorithms for kernel ridge regression. Experiments on large-scale regression datasets show that our method outperforms the random Fourier features method in accuracy.}
}
Endnote
%0 Conference Paper
%T Scaling up Kernel Ridge Regression via Locality Sensitive Hashing
%A Amir Zandieh
%A Navid Nouri
%A Ameya Velingker
%A Michael Kapralov
%A Ilya Razenshteyn
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-zandieh20a
%I PMLR
%P 4088--4097
%U https://proceedings.mlr.press/v108/zandieh20a.html
%V 108
%X Random binning features, introduced in the seminal paper of Rahimi and Recht ’07, are an efficient method for approximating a kernel matrix using locality sensitive hashing. Random binning features provide a very simple and efficient way to approximate the Laplace kernel but unfortunately do not apply to many important classes of kernels, notably ones that generate smooth Gaussian processes, such as the Gaussian kernel and the Matérn kernel. In this paper we introduce a simple weighted version of random binning features and show that the corresponding kernel function generates Gaussian processes of any desired smoothness. We show that our weighted random binning features provide a spectral approximation to the corresponding kernel matrix, leading to efficient algorithms for kernel ridge regression. Experiments on large-scale regression datasets show that our method outperforms the random Fourier features method in accuracy.
APA
Zandieh, A., Nouri, N., Velingker, A., Kapralov, M. & Razenshteyn, I. (2020). Scaling up Kernel Ridge Regression via Locality Sensitive Hashing. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:4088-4097. Available from https://proceedings.mlr.press/v108/zandieh20a.html.
