Infinite Task Learning in RKHSs

Romain Brault, Alex Lambert, Zoltan Szabo, Maxime Sangnier, Florence d’Alche-Buc
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1294-1302, 2019.

Abstract

Machine learning has witnessed tremendous success in solving tasks depending on a single hyperparameter. When considering a finite number of tasks simultaneously, multi-task learning enables one to account for the similarities of the tasks via appropriate regularizers. A step further consists of learning a continuum of tasks for various loss functions. A promising approach, called Parametric Task Learning, has paved the way in the continuum setting for affine models and piecewise-linear loss functions. In this work, we introduce a novel approach called Infinite Task Learning, whose goal is to learn a function whose output is itself a function over the hyperparameter space. We leverage tools from operator-valued kernels and the associated vector-valued RKHSs, which provide explicit control over the role of the hyperparameters and also allow us to consider new types of constraints. We provide generalization guarantees for the suggested scheme and illustrate its efficiency in cost-sensitive classification, quantile regression and density level set estimation.
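As a rough illustration of the idea (not the authors' implementation), the NumPy sketch below instantiates Infinite Task Learning for joint quantile regression: the hyperparameter is the quantile level θ, the model h(x)(θ) is parametrized through a decomposable operator-valued kernel k_X(x, x')·k_Θ(θ, θ'), and training minimizes the pinball loss sampled over anchor levels plus a ridge penalty, by subgradient descent. All function names and hyperparameter values here are hypothetical.

```python
import numpy as np

def gauss_kernel(A, B, gamma):
    """Gaussian kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_itl(X, y, thetas, gamma_x=1.0, gamma_t=10.0, lam=1e-3,
            lr=0.1, n_iter=500):
    """Subgradient descent on the sampled pinball risk + ridge penalty.

    Model (decomposable-kernel assumption):
        h(x)(theta) = sum_{i,j} k_X(x, x_i) * k_Theta(theta, theta_j) * C[i, j]
    """
    Kx = gauss_kernel(X, X, gamma_x)                              # (n, n)
    Kt = gauss_kernel(thetas[:, None], thetas[:, None], gamma_t)  # (m, m)
    n, m = len(X), len(thetas)
    C = np.zeros((n, m))
    for _ in range(n_iter):
        pred = Kx @ C @ Kt            # pred[i, j] = h(x_i)(theta_j)
        r = y[:, None] - pred         # residuals at every quantile level
        # Subgradient of the pinball loss w.r.t. the predictions:
        # -theta where the residual is positive, (1 - theta) otherwise.
        g = np.where(r > 0, -thetas[None, :], (1 - thetas)[None, :])
        grad = Kx @ (g / (n * m)) @ Kt + lam * Kx @ C @ Kt
        C -= lr * grad
    # Return a predictor over new inputs and arbitrary quantile levels.
    return lambda Xnew, tnew: (
        gauss_kernel(Xnew, X, gamma_x) @ C
        @ gauss_kernel(thetas[:, None], np.atleast_2d(tnew).T, gamma_t)
    )

# Toy usage: learn the conditional quantiles of y = sin(2x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 3, (80, 1))
y = np.sin(2 * X[:, 0]) + 0.3 * rng.standard_normal(80)
thetas = np.linspace(0.05, 0.95, 19)           # anchor quantile levels
h = fit_itl(X, y, thetas)
print(h(np.array([[1.5]]), [0.1, 0.5, 0.9]))   # 10/50/90% quantile estimates
```

The key payoff of the continuum view is visible in the last line: although training only samples a finite grid of anchor levels, the learned h can be evaluated at any θ in (0, 1). The sketch omits the non-crossing constraints discussed in the paper.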

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-brault19a,
  title     = {Infinite Task Learning in RKHSs},
  author    = {Brault, Romain and Lambert, Alex and Szabo, Zoltan and Sangnier, Maxime and d'Alche-Buc, Florence},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1294--1302},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/brault19a/brault19a.pdf},
  url       = {https://proceedings.mlr.press/v89/brault19a.html}
}
Endnote
%0 Conference Paper
%T Infinite Task Learning in RKHSs
%A Romain Brault
%A Alex Lambert
%A Zoltan Szabo
%A Maxime Sangnier
%A Florence d’Alche-Buc
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-brault19a
%I PMLR
%P 1294--1302
%U https://proceedings.mlr.press/v89/brault19a.html
%V 89
APA
Brault, R., Lambert, A., Szabo, Z., Sangnier, M. & d’Alche-Buc, F. (2019). Infinite Task Learning in RKHSs. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1294-1302. Available from https://proceedings.mlr.press/v89/brault19a.html.
