Random Fourier Features For Operator-Valued Kernels

Romain Brault, Markus Heinonen, Florence Buc
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:110-125, 2016.

Abstract

Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
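To illustrate the idea the abstract builds on, here is a minimal sketch of the classical scalar Random Fourier Feature construction (Rahimi–Recht style) for a Gaussian kernel, together with a comment on the decomposable operator-valued special case. This is not the paper's exact operator-valued construction; the function name and parameters are illustrative.

```python
import numpy as np

def rff_features(X, n_features=2000, gamma=0.5, seed=0):
    """Classical scalar Random Fourier Features for the Gaussian kernel
    k(x, y) = exp(-gamma * ||x - y||^2).

    By Bochner's theorem this kernel is the Fourier transform of a Gaussian
    spectral density, so drawing omega ~ N(0, 2*gamma*I) and setting
    z(x) = sqrt(2/D) * cos(omega^T x + b) gives z(x)^T z(y) ≈ k(x, y).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy check: compare the random-feature Gram matrix against the exact kernel.
rng = np.random.default_rng(42)
X = rng.normal(size=(10, 3))
Z = rff_features(X, n_features=5000, gamma=0.5)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
max_err = np.abs(K_approx - K_exact).max()

# For a decomposable operator-valued kernel K(x, y) = k(x, y) * A, with A a
# p x p positive semidefinite matrix encoding output correlations, the same
# scalar features yield the matrix-valued approximation (z(x)^T z(y)) * A.
```

The approximation error shrinks at rate O(1/sqrt(D)) in the number of random features D, which is what makes linear models on these features an efficient substitute for exact kernel methods.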

Cite this Paper


BibTeX
@InProceedings{pmlr-v63-Brault39,
  title = {Random Fourier Features For Operator-Valued Kernels},
  author = {Romain Brault and Markus Heinonen and Florence Buc},
  booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
  pages = {110--125},
  year = {2016},
  editor = {Robert J. Durrant and Kee-Eung Kim},
  volume = {63},
  series = {Proceedings of Machine Learning Research},
  address = {The University of Waikato, Hamilton, New Zealand},
  month = {16--18 Nov},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v63/Brault39.pdf},
  url = {http://proceedings.mlr.press/v63/Brault39.html},
  abstract = {Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.}
}
Endnote
%0 Conference Paper
%T Random Fourier Features For Operator-Valued Kernels
%A Romain Brault
%A Markus Heinonen
%A Florence Buc
%B Proceedings of The 8th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Robert J. Durrant
%E Kee-Eung Kim
%F pmlr-v63-Brault39
%I PMLR
%J Proceedings of Machine Learning Research
%P 110--125
%U http://proceedings.mlr.press
%V 63
%W PMLR
%X Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
RIS
TY  - CPAPER
TI  - Random Fourier Features For Operator-Valued Kernels
AU  - Romain Brault
AU  - Markus Heinonen
AU  - Florence Buc
BT  - Proceedings of The 8th Asian Conference on Machine Learning
PY  - 2016/11/20
DA  - 2016/11/20
ED  - Robert J. Durrant
ED  - Kee-Eung Kim
ID  - pmlr-v63-Brault39
PB  - PMLR
SP  - 110
EP  - 125
DP  - PMLR
L1  - http://proceedings.mlr.press/v63/Brault39.pdf
UR  - http://proceedings.mlr.press/v63/Brault39.html
AB  - Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to get an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction relying on a generalization of Bochner’s theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
ER  -
APA
Brault, R., Heinonen, M. &amp; Buc, F. (2016). Random Fourier Features For Operator-Valued Kernels. Proceedings of The 8th Asian Conference on Machine Learning, in PMLR 63:110-125.
