Fast Learning in Reproducing Kernel Krein Spaces via Signed Measures

Fanghui Liu, Xiaolin Huang, Yingyi Chen, Johan Suykens
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:388-396, 2021.

Abstract

In this paper, we attempt to solve a long-standing open question for non-positive definite (non-PD) kernels in the machine learning community: can a given non-PD kernel be decomposed into the difference of two PD kernels (termed positive decomposition)? We recast this question from a distributional view by introducing signed measures, which transforms positive decomposition into measure decomposition: a series of non-PD kernels can be associated with the linear combination of specific finite Borel measures. In this manner, our distribution-based framework provides a necessary and sufficient condition answering this open question. Moreover, this solution is computationally implementable in practice, allowing non-PD kernels to scale to large-sample settings, and lets us devise the first random-features algorithm with an unbiased estimator. Experimental results on several benchmark datasets verify the effectiveness of our algorithm over existing methods.
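To make the positive-decomposition idea concrete, here is a minimal sketch (not the paper's algorithm) for a toy indefinite kernel that is already given as the difference of two Gaussian kernels, k(x, y) = k₊(x, y) − k₋(x, y). Each PD component is approximated with standard random Fourier features drawn from its own spectral (Gaussian) measure, and the two unbiased estimates are combined with opposite signs. All function names, bandwidths, and feature counts below are illustrative assumptions.

```python
import numpy as np

def rff_map(d, D, sigma, rng):
    # Random Fourier features for a Gaussian kernel exp(-||x-y||^2 / (2 sigma^2)):
    # frequencies sampled from its spectral measure N(0, I / sigma^2), plus random phases.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return lambda X: np.sqrt(2.0 / D) * np.cos(X @ W + b)

def indefinite_gram_estimate(X, Y, D=4000, s1=1.0, s2=2.0, seed=0):
    # Unbiased estimate of the indefinite kernel
    #   k(x, y) = exp(-||x-y||^2 / (2 s1^2)) - exp(-||x-y||^2 / (2 s2^2)):
    # estimate each PD component with its own feature map, then subtract.
    rng = np.random.default_rng(seed)
    phi_p = rff_map(X.shape[1], D, s1, rng)  # features for the "positive" part
    phi_m = rff_map(X.shape[1], D, s2, rng)  # features for the "negative" part
    return phi_p(X) @ phi_p(Y).T - phi_m(X) @ phi_m(Y).T
```

Since each component's feature inner product is an unbiased estimator of its PD kernel, the signed combination is an unbiased estimator of the indefinite kernel, in the spirit of the measure-decomposition view described above.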

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-liu21a,
  title     = {Fast Learning in Reproducing Kernel Krein Spaces via Signed Measures},
  author    = {Liu, Fanghui and Huang, Xiaolin and Chen, Yingyi and Suykens, Johan},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {388--396},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/liu21a/liu21a.pdf},
  url       = {http://proceedings.mlr.press/v130/liu21a.html},
  abstract  = {In this paper, we attempt to solve a long-standing open question for non-positive definite (non-PD) kernels in the machine learning community: can a given non-PD kernel be decomposed into the difference of two PD kernels (termed positive decomposition)? We recast this question from a distributional view by introducing signed measures, which transforms positive decomposition into measure decomposition: a series of non-PD kernels can be associated with the linear combination of specific finite Borel measures. In this manner, our distribution-based framework provides a necessary and sufficient condition answering this open question. Moreover, this solution is computationally implementable in practice, allowing non-PD kernels to scale to large-sample settings, and lets us devise the first random-features algorithm with an unbiased estimator. Experimental results on several benchmark datasets verify the effectiveness of our algorithm over existing methods.}
}
Endnote
%0 Conference Paper
%T Fast Learning in Reproducing Kernel Krein Spaces via Signed Measures
%A Fanghui Liu
%A Xiaolin Huang
%A Yingyi Chen
%A Johan Suykens
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-liu21a
%I PMLR
%P 388--396
%U http://proceedings.mlr.press/v130/liu21a.html
%V 130
%X In this paper, we attempt to solve a long-standing open question for non-positive definite (non-PD) kernels in the machine learning community: can a given non-PD kernel be decomposed into the difference of two PD kernels (termed positive decomposition)? We recast this question from a distributional view by introducing signed measures, which transforms positive decomposition into measure decomposition: a series of non-PD kernels can be associated with the linear combination of specific finite Borel measures. In this manner, our distribution-based framework provides a necessary and sufficient condition answering this open question. Moreover, this solution is computationally implementable in practice, allowing non-PD kernels to scale to large-sample settings, and lets us devise the first random-features algorithm with an unbiased estimator. Experimental results on several benchmark datasets verify the effectiveness of our algorithm over existing methods.
APA
Liu, F., Huang, X., Chen, Y. &amp; Suykens, J. (2021). Fast Learning in Reproducing Kernel Krein Spaces via Signed Measures. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:388-396. Available from http://proceedings.mlr.press/v130/liu21a.html.

Related Material