Learning in RKHM: a C*-Algebraic Twist for Kernel Machines

Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:692-708, 2023.

Abstract

Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years. In this paper, we provide a new twist to this rich literature by generalizing supervised learning in RKHS and vvRKHS to reproducing kernel Hilbert C*-module (RKHM), and show how to construct effective positive-definite kernels by considering the perspective of C*-algebra. Unlike the cases of RKHS and vvRKHS, we can use C*-algebras to enlarge representation spaces. This enables us to construct RKHMs whose representation power goes beyond RKHSs, vvRKHSs, and existing methods such as convolutional neural networks. Our framework is suitable, for example, for effectively analyzing image data by allowing the interaction of Fourier components.
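
For intuition, here is a minimal, hypothetical sketch (in NumPy) of the kind of C*-algebra-valued kernel the abstract alludes to; the function names and the specific construction are illustrative assumptions, not the authors' implementation. Taking the C*-algebra A to be the d x d circulant matrices, which the discrete Fourier transform jointly diagonalizes, a kernel of the form k(x, y) = c(x)* c(y) takes matrix values in A instead of scalar values in R or C, which is one way to realize the "enlarged representation space" the abstract describes.

    import numpy as np

    def circulant(v):
        # d x d circulant matrix whose first column is v; every
        # circulant matrix is diagonalized by the DFT basis.
        d = len(v)
        return np.stack([np.roll(v, j) for j in range(d)], axis=1)

    def rkhm_kernel(x, y):
        # Hypothetical A-valued kernel k(x, y) = c(x)* c(y), where
        # c(x) is the circulant matrix built from x and A is the
        # C*-algebra of d x d circulant matrices. Kernels of the
        # form phi(x)* phi(y) are positive definite by construction.
        cx = circulant(np.asarray(x, dtype=complex))
        cy = circulant(np.asarray(y, dtype=complex))
        return cx.conj().T @ cy

    # Usage: the kernel value is a d x d matrix, not a scalar.
    K = rkhm_kernel([1.0, 2.0, 3.0], [0.0, 1.0, 0.0])

Because circulant matrices multiply by pointwise multiplication of their DFT spectra, such matrix-valued kernel evaluations retain per-frequency information that a scalar kernel would collapse to a single number, which is the sense in which Fourier components can interact in this framework.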

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-hashimoto23a,
  title     = {Learning in RKHM: a C*-Algebraic Twist for Kernel Machines},
  author    = {Hashimoto, Yuka and Ikeda, Masahiro and Kadri, Hachem},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {692--708},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/hashimoto23a/hashimoto23a.pdf},
  url       = {https://proceedings.mlr.press/v206/hashimoto23a.html}
}
Endnote
%0 Conference Paper
%T Learning in RKHM: a C*-Algebraic Twist for Kernel Machines
%A Yuka Hashimoto
%A Masahiro Ikeda
%A Hachem Kadri
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-hashimoto23a
%I PMLR
%P 692--708
%U https://proceedings.mlr.press/v206/hashimoto23a.html
%V 206
APA
Hashimoto, Y., Ikeda, M., & Kadri, H. (2023). Learning in RKHM: a C*-Algebraic Twist for Kernel Machines. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:692-708. Available from https://proceedings.mlr.press/v206/hashimoto23a.html.