On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes

Pravesh K. Kothari, Roi Livni
Proceedings of the 31st International Conference on Algorithmic Learning Theory, PMLR 117:422-450, 2020.

Abstract

We study the expressive power of kernel methods and the algorithmic feasibility of multiple kernel learning for a special, rich class of kernels. Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in the literature, such as polynomial kernels and radial basis functions. We then describe the geometric and spectral structure of this family of kernels over the hypercube (and, to some extent, for any compact domain). Our structural results allow us to prove meaningful limitations on the expressive power of the class, as well as to derive several efficient algorithms for learning kernels over different domains.
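To make the kernel families named in the abstract concrete, here is a minimal illustrative sketch (not taken from the paper) of two such kernels evaluated over the Boolean hypercube {-1, +1}^n: a polynomial kernel and a radial basis function kernel. On the hypercube both depend only on the inner product <x, y>, equivalently on the Euclidean distance, since ||x - y||^2 = 2n - 2<x, y>. The function and parameter names below are illustrative choices, not the paper's.

```python
# Illustrative sketch: two classic kernel families over the Boolean hypercube {-1, +1}^n.
# On the hypercube, the kernel value depends only on <x, y> (equivalently on ||x - y||,
# since ||x - y||^2 = 2n - 2<x, y> for x, y in {-1, +1}^n).
import numpy as np

def polynomial_kernel(X, Y, degree=3, coef0=1.0):
    """k(x, y) = (<x, y> + coef0)^degree  -- a polynomial kernel."""
    return (X @ Y.T + coef0) ** degree

def rbf_kernel(X, Y, gamma=0.1):
    """k(x, y) = exp(-gamma * ||x - y||^2)  -- a radial basis function kernel."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 10, 5                              # hypercube dimension, number of points
    X = rng.choice([-1.0, 1.0], size=(m, n))  # points on the hypercube {-1, +1}^n

    K_poly = polynomial_kernel(X, X)
    K_rbf = rbf_kernel(X, X)

    # Both Gram matrices are symmetric positive semidefinite, as required of kernels
    # (checked up to numerical tolerance).
    print(np.linalg.eigvalsh(K_poly).min() >= -1e-8)
    print(np.linalg.eigvalsh(K_rbf).min() >= -1e-8)
```

The paper's "Euclidean kernels" form a broader class containing such families; this snippet only illustrates the two examples the abstract mentions, not the paper's definitions or algorithms.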

Cite this Paper


BibTeX
@InProceedings{pmlr-v117-pravesh20a,
  title     = {On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes},
  author    = {Kothari, Pravesh K. and Livni, Roi},
  booktitle = {Proceedings of the 31st International Conference on Algorithmic Learning Theory},
  pages     = {422--450},
  year      = {2020},
  editor    = {Kontorovich, Aryeh and Neu, Gergely},
  volume    = {117},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Feb--11 Feb},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v117/pravesh20a/pravesh20a.pdf},
  url       = {https://proceedings.mlr.press/v117/pravesh20a.html},
  abstract  = {We study the expressive power of kernel methods and the algorithmic feasibility of multiple kernel learning for a special, rich class of kernels. Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in the literature, such as polynomial kernels and radial basis functions. We then describe the geometric and spectral structure of this family of kernels over the hypercube (and, to some extent, for any compact domain). Our structural results allow us to prove meaningful limitations on the expressive power of the class, as well as to derive several efficient algorithms for learning kernels over different domains.}
}
Endnote
%0 Conference Paper
%T On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes
%A Pravesh K. Kothari
%A Roi Livni
%B Proceedings of the 31st International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2020
%E Aryeh Kontorovich
%E Gergely Neu
%F pmlr-v117-pravesh20a
%I PMLR
%P 422--450
%U https://proceedings.mlr.press/v117/pravesh20a.html
%V 117
%X We study the expressive power of kernel methods and the algorithmic feasibility of multiple kernel learning for a special, rich class of kernels. Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in the literature, such as polynomial kernels and radial basis functions. We then describe the geometric and spectral structure of this family of kernels over the hypercube (and, to some extent, for any compact domain). Our structural results allow us to prove meaningful limitations on the expressive power of the class, as well as to derive several efficient algorithms for learning kernels over different domains.
APA
Kothari, P.K. & Livni, R. (2020). On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes. Proceedings of the 31st International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 117:422-450. Available from https://proceedings.mlr.press/v117/pravesh20a.html.