Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees

Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:253-262, 2017.

Abstract

Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.
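The technique the paper analyzes can be illustrated with a minimal sketch of the classic random Fourier features construction (Rahimi & Recht) for the Gaussian/RBF kernel. This is not the paper's algorithm or code; the function names and parameters below are illustrative assumptions, shown only to make the kernel-approximation setup concrete.

```python
import numpy as np

def rff_map(X, n_features=500, sigma=1.0, seed=None):
    """Map data X (n x d) to random Fourier features z(X) so that
    z(x) @ z(y) approximates the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    All names here are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies sampled from the kernel's Fourier transform (a Gaussian).
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    # Random phase shifts.
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the approximate kernel Z Z^T against the exact RBF kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = rff_map(X, n_features=5000, sigma=1.0, seed=1)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
err = np.abs(K_approx - K_exact).max()
```

With enough features the entrywise error shrinks at roughly a 1/sqrt(n_features) rate; the paper's contribution is stronger *spectral* guarantees on how many such features are needed before the approximate kernel matrix can safely replace the exact one in kernel ridge regression.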

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-avron17a,
  title     = {Random {F}ourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees},
  author    = {Haim Avron and Michael Kapralov and Cameron Musco and Christopher Musco and Ameya Velingker and Amir Zandieh},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {253--262},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/avron17a/avron17a.pdf},
  url       = {https://proceedings.mlr.press/v70/avron17a.html},
  abstract  = {Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.}
}
Endnote
%0 Conference Paper
%T Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees
%A Haim Avron
%A Michael Kapralov
%A Cameron Musco
%A Christopher Musco
%A Ameya Velingker
%A Amir Zandieh
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-avron17a
%I PMLR
%P 253--262
%U https://proceedings.mlr.press/v70/avron17a.html
%V 70
%X Random Fourier features is one of the most popular techniques for scaling up kernel methods, such as kernel ridge regression. However, despite impressive empirical results, the statistical properties of random Fourier features are still not well understood. In this paper we take steps toward filling this gap. Specifically, we approach random Fourier features from a spectral matrix approximation point of view, give tight bounds on the number of Fourier features required to achieve a spectral approximation, and show how spectral matrix approximation bounds imply statistical guarantees for kernel ridge regression.
APA
Avron, H., Kapralov, M., Musco, C., Musco, C., Velingker, A. & Zandieh, A. (2017). Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:253-262. Available from https://proceedings.mlr.press/v70/avron17a.html.

Related Material