Smooth Operators

Steffen Grunewalder, Arthur Gretton, John Shawe-Taylor
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1184-1192, 2013.

Abstract

We develop a generic approach to form smooth versions of basic mathematical operations like multiplication, composition, change of measure, and conditional expectation, among others. Operations which result in functions outside the reproducing kernel Hilbert space (such as the product of two RKHS functions) are approximated via a natural cost function, such that the solution is guaranteed to be in the targeted RKHS. This approximation problem is reduced to a regression problem using an adjoint trick, and solved in a vector-valued RKHS, consisting of continuous, linear, smooth operators which map from an input, real-valued RKHS to the desired target RKHS. Important constraints, such as an almost everywhere positive density, can be enforced or approximated naturally in this framework, using convex constraints on the operators. Finally, smooth operators can be composed to accomplish more complex machine learning tasks, such as the sum rule and kernelized approximate Bayesian inference, where state-of-the-art convergence rates are obtained.
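One of the smooth operators discussed in the abstract, the conditional expectation, is commonly estimated in an RKHS by kernel ridge regression (the conditional mean embedding). The sketch below is an illustrative toy implementation under assumed choices (Gaussian kernel, hand-picked bandwidth and regularizer, synthetic data), not the paper's code:

```python
import numpy as np

def gauss_kernel(A, B, sigma=0.2):
    # Gaussian (RBF) kernel matrix between the 1-D sample arrays A and B
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def conditional_expectation(x_train, y_train, f, x_query, lam=1e-3):
    """Kernel estimate of E[f(Y) | X = x_query]: ridge regression of
    f(Y) on X in the RKHS, i.e. k(x)^T (K + n*lam*I)^{-1} f(y)."""
    n = len(x_train)
    K = gauss_kernel(x_train, x_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), f(y_train))
    return gauss_kernel(x_query, x_train) @ alpha

# Toy data where Y is (nearly) a deterministic function of X, so
# E[Y | X = x] should be close to x itself.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = x + 0.01 * rng.normal(size=200)
est = conditional_expectation(x, y, lambda t: t, np.array([0.5]))
```

Here `est` should be close to 0.5, up to smoothing bias from the kernel bandwidth and regularizer.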

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-grunewalder13,
  title = {Smooth Operators},
  author = {Grunewalder, Steffen and Gretton, Arthur and Shawe-Taylor, John},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages = {1184--1192},
  year = {2013},
  editor = {Dasgupta, Sanjoy and McAllester, David},
  volume = {28},
  number = {3},
  series = {Proceedings of Machine Learning Research},
  address = {Atlanta, Georgia, USA},
  month = {17--19 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v28/grunewalder13.pdf},
  url = {https://proceedings.mlr.press/v28/grunewalder13.html},
  abstract = {We develop a generic approach to form smooth versions of basic mathematical operations like multiplication, composition, change of measure, and conditional expectation, among others. Operations which result in functions outside the reproducing kernel Hilbert space (such as the product of two RKHS functions) are approximated via a natural cost function, such that the solution is guaranteed to be in the targeted RKHS. This approximation problem is reduced to a regression problem using an adjoint trick, and solved in a vector-valued RKHS, consisting of continuous, linear, smooth operators which map from an input, real-valued RKHS to the desired target RKHS. Important constraints, such as an almost everywhere positive density, can be enforced or approximated naturally in this framework, using convex constraints on the operators. Finally, smooth operators can be composed to accomplish more complex machine learning tasks, such as the sum rule and kernelized approximate Bayesian inference, where state-of-the-art convergence rates are obtained.}
}
Endnote
%0 Conference Paper
%T Smooth Operators
%A Steffen Grunewalder
%A Arthur Gretton
%A John Shawe-Taylor
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-grunewalder13
%I PMLR
%P 1184--1192
%U https://proceedings.mlr.press/v28/grunewalder13.html
%V 28
%N 3
%X We develop a generic approach to form smooth versions of basic mathematical operations like multiplication, composition, change of measure, and conditional expectation, among others. Operations which result in functions outside the reproducing kernel Hilbert space (such as the product of two RKHS functions) are approximated via a natural cost function, such that the solution is guaranteed to be in the targeted RKHS. This approximation problem is reduced to a regression problem using an adjoint trick, and solved in a vector-valued RKHS, consisting of continuous, linear, smooth operators which map from an input, real-valued RKHS to the desired target RKHS. Important constraints, such as an almost everywhere positive density, can be enforced or approximated naturally in this framework, using convex constraints on the operators. Finally, smooth operators can be composed to accomplish more complex machine learning tasks, such as the sum rule and kernelized approximate Bayesian inference, where state-of-the-art convergence rates are obtained.
RIS
TY  - CPAPER
TI  - Smooth Operators
AU  - Steffen Grunewalder
AU  - Arthur Gretton
AU  - John Shawe-Taylor
BT  - Proceedings of the 30th International Conference on Machine Learning
DA  - 2013/05/26
ED  - Sanjoy Dasgupta
ED  - David McAllester
ID  - pmlr-v28-grunewalder13
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 28
IS  - 3
SP  - 1184
EP  - 1192
L1  - http://proceedings.mlr.press/v28/grunewalder13.pdf
UR  - https://proceedings.mlr.press/v28/grunewalder13.html
AB  - We develop a generic approach to form smooth versions of basic mathematical operations like multiplication, composition, change of measure, and conditional expectation, among others. Operations which result in functions outside the reproducing kernel Hilbert space (such as the product of two RKHS functions) are approximated via a natural cost function, such that the solution is guaranteed to be in the targeted RKHS. This approximation problem is reduced to a regression problem using an adjoint trick, and solved in a vector-valued RKHS, consisting of continuous, linear, smooth operators which map from an input, real-valued RKHS to the desired target RKHS. Important constraints, such as an almost everywhere positive density, can be enforced or approximated naturally in this framework, using convex constraints on the operators. Finally, smooth operators can be composed to accomplish more complex machine learning tasks, such as the sum rule and kernelized approximate Bayesian inference, where state-of-the-art convergence rates are obtained.
ER  -
APA
Grunewalder, S., Gretton, A. & Shawe-Taylor, J. (2013). Smooth Operators. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1184-1192. Available from https://proceedings.mlr.press/v28/grunewalder13.html.