Scalable Learning in Reproducing Kernel Krein Spaces

Dino Oglic, Thomas Gärtner
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4912-4921, 2019.

Abstract

We provide the first mathematically complete derivation of the Nyström method for low-rank approximation of indefinite kernels and propose an efficient method for finding an approximate eigendecomposition of such kernel matrices. Building on this result, we devise highly scalable methods for learning in reproducing kernel Krein spaces. The devised approaches provide a principled and theoretically well-founded means to tackle large scale learning problems with indefinite kernels. The main motivation for our work comes from problems with structured representations (e.g., graphs, strings, time-series), where it is relatively easy to devise a pairwise (dis)similarity function based on intuition and/or knowledge of domain experts. Such functions are typically not positive definite and it is often well beyond the expertise of practitioners to verify this condition. The effectiveness of the devised approaches is evaluated empirically using indefinite kernels defined on structured and vectorial data representations.
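To illustrate the core idea behind a Nyström-style low-rank approximation of an indefinite kernel matrix, the following is a minimal NumPy sketch, not the authors' exact algorithm: it eigendecomposes the landmark block (whose eigenvalues may be negative for an indefinite kernel), keeps the eigenpairs with largest absolute eigenvalue, and reconstructs the full matrix through the sampled columns. The function name and the spectral-truncation choice are illustrative assumptions.

```python
import numpy as np

def nystroem_indefinite(K, landmarks, rank):
    """Nystrom-style low-rank approximation of a symmetric, possibly
    indefinite kernel matrix K from a subset of landmark columns.

    Illustrative sketch only; see the paper for the principled derivation.
    """
    C = K[:, landmarks]                  # n x m sampled columns
    W = K[np.ix_(landmarks, landmarks)]  # m x m landmark block
    # Eigendecomposition of the landmark block; for an indefinite kernel
    # some eigenvalues are negative, so we rank them by magnitude.
    vals, vecs = np.linalg.eigh(W)
    order = np.argsort(-np.abs(vals))[:rank]
    vals, vecs = vals[order], vecs[:, order]
    # Pseudoinverse of the retained eigenvalues (guard near-zero ones).
    inv = np.where(np.abs(vals) > 1e-12, 1.0 / vals, 0.0)
    # K approx C U diag(1/lambda) U^T C^T
    return C @ (vecs * inv) @ vecs.T @ C.T

# Usage: a rank-3 indefinite kernel matrix is recovered exactly when the
# landmark block captures its column space.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
K = A @ np.diag([2.0, -1.5, 0.5]) @ A.T   # symmetric, indefinite, rank 3
K_hat = nystroem_indefinite(K, np.arange(5), rank=3)
```

As in the positive-definite case, the approximation is exact whenever the kernel matrix has rank at most `rank` and the landmark block has the same rank; the only change needed for indefinite kernels is working with signed eigenvalues instead of assuming nonnegativity.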

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-oglic19a,
  title     = {Scalable Learning in Reproducing Kernel Krein Spaces},
  author    = {Oglic, Dino and G{\"a}rtner, Thomas},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4912--4921},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/oglic19a/oglic19a.pdf},
  url       = {http://proceedings.mlr.press/v97/oglic19a.html},
  abstract  = {We provide the first mathematically complete derivation of the Nystr{\"o}m method for low-rank approximation of indefinite kernels and propose an efficient method for finding an approximate eigendecomposition of such kernel matrices. Building on this result, we devise highly scalable methods for learning in reproducing kernel Krein spaces. The devised approaches provide a principled and theoretically well-founded means to tackle large scale learning problems with indefinite kernels. The main motivation for our work comes from problems with structured representations (e.g., graphs, strings, time-series), where it is relatively easy to devise a pairwise (dis)similarity function based on intuition and/or knowledge of domain experts. Such functions are typically not positive definite and it is often well beyond the expertise of practitioners to verify this condition. The effectiveness of the devised approaches is evaluated empirically using indefinite kernels defined on structured and vectorial data representations.}
}
Endnote
%0 Conference Paper
%T Scalable Learning in Reproducing Kernel Krein Spaces
%A Dino Oglic
%A Thomas Gärtner
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-oglic19a
%I PMLR
%P 4912--4921
%U http://proceedings.mlr.press/v97/oglic19a.html
%V 97
%X We provide the first mathematically complete derivation of the Nyström method for low-rank approximation of indefinite kernels and propose an efficient method for finding an approximate eigendecomposition of such kernel matrices. Building on this result, we devise highly scalable methods for learning in reproducing kernel Krein spaces. The devised approaches provide a principled and theoretically well-founded means to tackle large scale learning problems with indefinite kernels. The main motivation for our work comes from problems with structured representations (e.g., graphs, strings, time-series), where it is relatively easy to devise a pairwise (dis)similarity function based on intuition and/or knowledge of domain experts. Such functions are typically not positive definite and it is often well beyond the expertise of practitioners to verify this condition. The effectiveness of the devised approaches is evaluated empirically using indefinite kernels defined on structured and vectorial data representations.
APA
Oglic, D. & Gärtner, T. (2019). Scalable Learning in Reproducing Kernel Krein Spaces. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4912-4921. Available from http://proceedings.mlr.press/v97/oglic19a.html.