Optimality Implies Kernel Sum Classifiers are Statistically Efficient

Raphael Meyer, Jean Honorio
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4566-4574, 2019.

Abstract

We propose a novel combination of optimization tools with learning theory bounds in order to analyze the sample complexity of optimal kernel sum classifiers. This contrasts the typical learning theoretic results which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of our analysis, we also provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.
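To make the abstract's central object concrete, the following is a hypothetical sketch of a kernel sum classifier: a kernel ridge-style classifier whose kernel matrix is the unweighted sum of an RBF and a linear base kernel. This is an illustration only, not the paper's method; the function names, the toy data, and the regularization value `lam` are all assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def linear_kernel(X, Z):
    # Linear kernel matrix between the rows of X and Z.
    return X @ Z.T

def fit_kernel_sum(X, y, kernels, lam=1e-2):
    # Sum the base kernel matrices, then solve the regularized
    # least-squares system (K + lam*I) alpha = y for the dual weights.
    K = sum(k(X, X) for k in kernels)
    return np.linalg.solve(K + lam * np.eye(len(y)), y.astype(float))

def predict_kernel_sum(Xtr, Xte, alpha, kernels):
    # Classify test points by the sign of the summed-kernel expansion.
    K = sum(k(Xte, Xtr) for k in kernels)
    return np.sign(K @ alpha)

# Toy linearly separable data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])

kernels = [rbf_kernel, linear_kernel]
alpha = fit_kernel_sum(X, y, kernels)
pred = predict_kernel_sum(X, X, alpha, kernels)
print("training accuracy:", (pred == y).mean())
```

The paper's analysis concerns classifiers of this general shape (a classifier built from a sum of base kernels) under the assumption that the learned classifier is optimal; multiple kernel learning methods additionally learn weights on each base kernel rather than using a fixed unweighted sum.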

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-meyer19a,
  title = {Optimality Implies Kernel Sum Classifiers are Statistically Efficient},
  author = {Meyer, Raphael and Honorio, Jean},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {4566--4574},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/meyer19a/meyer19a.pdf},
  url = {https://proceedings.mlr.press/v97/meyer19a.html},
  abstract = {We propose a novel combination of optimization tools with learning theory bounds in order to analyze the sample complexity of optimal kernel sum classifiers. This contrasts the typical learning theoretic results which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of our analysis, we also provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.}
}
Endnote
%0 Conference Paper
%T Optimality Implies Kernel Sum Classifiers are Statistically Efficient
%A Raphael Meyer
%A Jean Honorio
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-meyer19a
%I PMLR
%P 4566--4574
%U https://proceedings.mlr.press/v97/meyer19a.html
%V 97
%X We propose a novel combination of optimization tools with learning theory bounds in order to analyze the sample complexity of optimal kernel sum classifiers. This contrasts the typical learning theoretic results which hold for all (potentially suboptimal) classifiers. Our work also justifies assumptions made in prior work on multiple kernel learning. As a byproduct of our analysis, we also provide a new form of Rademacher complexity for hypothesis classes containing only optimal classifiers.
APA
Meyer, R. & Honorio, J. (2019). Optimality Implies Kernel Sum Classifiers are Statistically Efficient. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4566-4574. Available from https://proceedings.mlr.press/v97/meyer19a.html.