Fast Sketching of Polynomial Kernels of Polynomial Degree

Zhao Song, David Woodruff, Zheng Yu, Lichen Zhang
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:9812-9823, 2021.

Abstract

Kernel methods are fundamental in machine learning, and faster algorithms for kernel approximation provide direct speedups for many core tasks in machine learning. The polynomial kernel is especially important as other kernels can often be approximated by the polynomial kernel via a Taylor series expansion. Recent techniques in oblivious sketching reduce the dependence in the running time on the degree $q$ of the polynomial kernel from exponential to polynomial, which is useful for the Gaussian kernel, for which $q$ can be chosen to be polylogarithmic. However, for more slowly growing kernels, such as the neural tangent and arc cosine kernels, $q$ needs to be polynomial, and previous work incurs a polynomial factor slowdown in the running time. We give a new oblivious sketch which greatly improves upon this running time, by removing the dependence on $q$ in the leading order term. Combined with a novel sampling scheme, we give the fastest algorithms for approximating a large family of slow-growing kernels.
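The Taylor-expansion reduction mentioned in the abstract can be made concrete for the Gaussian kernel: since $e^{-\|x-y\|_2^2/2} = e^{-\|x\|_2^2/2} \, e^{-\|y\|_2^2/2} \sum_{i=0}^{\infty} \frac{\langle x, y \rangle^i}{i!}$, truncating the series at degree $q$ reduces Gaussian kernel approximation to approximating the polynomial kernels $\langle x, y \rangle^i$ for $i \le q$.

The paper's new sketch is not reproduced on this page. As a point of reference for the oblivious-sketching paradigm the abstract describes, the snippet below is a minimal NumPy implementation of the classic TensorSketch of Pham and Pagh (2013), the baseline whose guarantees degrade exponentially with the degree $q$ and which the later line of work, including this paper, improves. The function name tensorsketch and the parameter m (sketch dimension) are illustrative choices, not notation from the paper.

import numpy as np

def tensorsketch(X, q, m, seed=0):
    """Oblivious sketch of each row of X for the degree-q polynomial
    kernel k(x, y) = <x, y>**q: with Z = tensorsketch(X, q, m), the
    Gram matrix Z @ Z.T equals (X @ X.T)**q in expectation (entrywise
    power). Returns an (n, m) matrix."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Multiply the FFTs of q independent CountSketches entrywise; the
    # inverse FFT is then a CountSketch of the q-fold self-tensor-product
    # of each row, avoiding the d**q cost of forming it explicitly.
    # (A sparse CountSketch gives the usual fast running time; a dense
    # d-by-m matrix is used here only for clarity.)
    prod = np.ones((n, m), dtype=complex)
    for _ in range(q):
        # one CountSketch: hash h: [d] -> [m], signs s: [d] -> {+1, -1}
        h = rng.integers(0, m, size=d)
        s = rng.choice([-1.0, 1.0], size=d)
        S = np.zeros((d, m))
        S[np.arange(d), h] = s
        prod *= np.fft.fft(X @ S, axis=1)
    return np.real(np.fft.ifft(prod, axis=1))

# Toy check: inner products of sketches approximate the polynomial kernel.
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 30))
Z = tensorsketch(X, q=3, m=8192)
exact = (X @ X.T) ** 3
print(np.max(np.abs(Z @ Z.T - exact)))

Note that the sketch dimension m needed by TensorSketch for a fixed accuracy grows exponentially with q; this is precisely the kind of dependence on q that the sketch proposed in this paper is designed to remove from the leading-order term.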

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-song21c,
  title     = {Fast Sketching of Polynomial Kernels of Polynomial Degree},
  author    = {Song, Zhao and Woodruff, David and Yu, Zheng and Zhang, Lichen},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {9812--9823},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/song21c/song21c.pdf},
  url       = {https://proceedings.mlr.press/v139/song21c.html}
}
Endnote
%0 Conference Paper
%T Fast Sketching of Polynomial Kernels of Polynomial Degree
%A Zhao Song
%A David Woodruff
%A Zheng Yu
%A Lichen Zhang
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-song21c
%I PMLR
%P 9812--9823
%U https://proceedings.mlr.press/v139/song21c.html
%V 139
APA
Song, Z., Woodruff, D., Yu, Z. & Zhang, L. (2021). Fast Sketching of Polynomial Kernels of Polynomial Degree. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:9812-9823. Available from https://proceedings.mlr.press/v139/song21c.html.