Infinite Task Learning in RKHSs
Proceedings of Machine Learning Research, PMLR 89:1294-1302, 2019.
Abstract
Machine learning has witnessed tremendous success in solving tasks depending on a single hyperparameter. When considering simultaneously a finite number of tasks, multi-task learning enables one to account for the similarities of the tasks via appropriate regularizers. A step further consists of learning a continuum of tasks for various loss functions. A promising approach, called Parametric Task Learning, has paved the way in the continuum setting for affine models and piecewise-linear loss functions. In this work, we introduce a novel approach called Infinite Task Learning, whose goal is to learn a function whose output is a function over the hyperparameter space. We leverage tools from operator-valued kernels and the associated vector-valued RKHSs, which provide explicit control over the role of the hyperparameters and also allow us to consider new types of constraints. We provide generalization guarantees for the suggested scheme and illustrate its efficiency in cost-sensitive classification, quantile regression and density level set estimation.
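As a rough illustration of the continuum-of-tasks idea in the quantile regression instance, the sketch below jointly fits quantile functions over a continuum of quantile levels t (the hyperparameter), sampling fresh levels at each training step. It uses a product of scalar Gaussian kernels on the input and on t as a simple stand-in for the operator-valued kernel machinery of the paper; the data, kernel bandwidths, anchor grid, and subgradient loop are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = sin(x) plus heteroscedastic noise, so quantile curves differ.
n = 60
X = rng.uniform(0, 6, n)
y = np.sin(X) + 0.3 * (1 + X / 6) * rng.normal(size=n)

def k_x(a, b, gamma=1.0):           # Gaussian kernel on inputs (assumed bandwidth)
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def k_t(a, b, gamma=4.0):           # Gaussian kernel on quantile levels t
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

# Model h(x, t) = sum_{i,j} alpha[i, j] k_x(x, X_i) k_t(t, T_j),
# i.e. a function of x whose output is itself a function of t.
T = np.linspace(0.1, 0.9, 9)        # anchor levels for the t-expansion
Kx = k_x(X, X)
alpha = np.zeros((n, len(T)))

def avg_pinball(alpha, ts_eval):
    """Mean pinball (quantile) loss of h over a fixed grid of levels."""
    P = Kx @ alpha @ k_t(ts_eval, T).T
    R = y[:, None] - P
    return np.mean(np.maximum(ts_eval[None, :] * R, (ts_eval[None, :] - 1) * R))

ts_eval = np.linspace(0.1, 0.9, 5)
loss_before = avg_pinball(alpha, ts_eval)

lr, lam = 0.01, 1e-3
for epoch in range(300):
    # Sample hyperparameters anew each step: the "continuum" of tasks.
    ts = rng.uniform(0.05, 0.95, 8)
    Ktt = k_t(ts, T)                        # (m, |T|)
    preds = Kx @ alpha @ Ktt.T              # (n, m): h(X_i, ts_j)
    resid = y[:, None] - preds
    # Subgradient of the pinball loss w.r.t. predictions.
    g = np.where(resid > 0, -ts[None, :], (1 - ts)[None, :])
    grad = Kx.T @ g @ Ktt / (n * len(ts)) + lam * alpha
    alpha -= lr * grad
```

After training, `avg_pinball(alpha, ts_eval)` drops well below `loss_before`, and evaluating `Kx @ alpha @ k_t(ts, T).T` at any level t in (0, 1) yields a quantile estimate, including levels never sampled during training.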