Efficient Multitask Feature and Relationship Learning

Han Zhao, Otilia Stretcu, Alexander J. Smola, Geoffrey J. Gordon
Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, PMLR 115:777-787, 2020.

Abstract

We consider a multitask learning problem, in which several predictors are learned jointly. Prior research has shown that learning the relations between tasks, and between the input features, together with the predictor, can lead to better generalization and interpretability, which has proved useful in many application domains. In this paper, we consider a formulation of multitask learning that learns the relationships both between tasks and between features, represented through a task covariance and a feature covariance matrix, respectively. First, we demonstrate that existing methods proposed for this problem present an issue that may lead to ill-posed optimization. We then propose an alternative formulation, as well as an efficient algorithm to optimize it. Using ideas from optimization and graph theory, we propose an efficient coordinate-wise minimization algorithm that has a closed-form solution for each block subproblem. Our experiments show that the proposed optimization method is orders of magnitude faster than its competitors. We also provide a nonlinear extension that achieves better generalization than existing methods.
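The block coordinate-descent idea described above can be sketched in NumPy under standard assumptions for this family of models (the exact objective and updates in the paper may differ). Here we take a matrix-variate regularizer tr(Ω₁⁻¹ W Ω₂⁻¹ Wᵀ) coupling a feature covariance Ω₁ and a task covariance Ω₂, alternate over three blocks, and use the classical closed forms: the W-step is a generalized Sylvester equation (solved here by Kronecker vectorization for small dimensions), and each trace-constrained covariance step has the closed-form solution A^{1/2}/tr(A^{1/2}). All names (`mtl_altmin`, `eta`) are illustrative, not from the paper.

```python
import numpy as np

def msqrt(A):
    """Symmetric PSD matrix square root via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 1e-10, None)          # guard against tiny negative eigenvalues
    return (V * np.sqrt(w)) @ V.T

def mtl_altmin(X, Y, eta=0.1, iters=20):
    """Alternating minimization of
        ||X W - Y||_F^2 + eta * tr(O1^{-1} W O2^{-1} W^T)
    over W (d x m), feature covariance O1 (tr = 1), task covariance O2 (tr = 1).
    A sketch only; the paper's actual formulation/updates may differ."""
    n, d = X.shape
    m = Y.shape[1]
    O1 = np.eye(d) / d                   # trace-normalized initializations
    O2 = np.eye(m) / m
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(iters):
        O1inv = np.linalg.inv(O1)
        O2inv = np.linalg.inv(O2)
        # W-step: stationarity gives XtX W + eta * O1inv W O2inv = XtY,
        # i.e. (I_m (x) XtX + eta * O2inv (x) O1inv) vec(W) = vec(XtY)
        A = np.kron(np.eye(m), XtX) + eta * np.kron(O2inv, O1inv)
        W = np.linalg.solve(A, XtY.reshape(-1, order='F')).reshape(d, m, order='F')
        # Covariance steps: argmin_{O > 0, tr(O)=1} tr(O^{-1} A) = A^{1/2} / tr(A^{1/2})
        S1 = msqrt(W @ O2inv @ W.T + 1e-8 * np.eye(d))
        O1 = S1 / np.trace(S1)
        S2 = msqrt(W.T @ np.linalg.inv(O1) @ W + 1e-8 * np.eye(m))
        O2 = S2 / np.trace(S2)
    return W, O1, O2
```

Each block update is closed-form, which is what makes per-iteration cost low; for large d·m one would solve the Sylvester equation directly rather than form the Kronecker system.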

Cite this Paper


BibTeX
@InProceedings{pmlr-v115-zhao20a,
  title     = {Efficient Multitask Feature and Relationship Learning},
  author    = {Zhao, Han and Stretcu, Otilia and Smola, Alexander J. and Gordon, Geoffrey J.},
  booktitle = {Proceedings of The 35th Uncertainty in Artificial Intelligence Conference},
  pages     = {777--787},
  year      = {2020},
  editor    = {Adams, Ryan P. and Gogate, Vibhav},
  volume    = {115},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v115/zhao20a/zhao20a.pdf},
  url       = {https://proceedings.mlr.press/v115/zhao20a.html},
  abstract  = {We consider a multitask learning problem, in which several predictors are learned jointly. Prior research has shown that learning the relations between tasks, and between the input features, together with the predictor, can lead to better generalization and interpretability, which proved to be useful for applications in many domains. In this paper, we consider a formulation of multitask learning that learns the relationships both between tasks and between features, represented through a task covariance and a feature covariance matrix, respectively. First, we demonstrate that existing methods proposed for this problem present an issue that may lead to ill-posed optimization. We then propose an alternative formulation, as well as an efficient algorithm to optimize it. Using ideas from optimization and graph theory, we propose an efficient coordinate-wise minimization algorithm that has a closed form solution for each block subproblem. Our experiments show that the proposed optimization method is orders of magnitude faster than its competitors. We also provide a nonlinear extension that is able to achieve better generalization than existing methods.}
}
Endnote
%0 Conference Paper
%T Efficient Multitask Feature and Relationship Learning
%A Han Zhao
%A Otilia Stretcu
%A Alexander J. Smola
%A Geoffrey J. Gordon
%B Proceedings of The 35th Uncertainty in Artificial Intelligence Conference
%C Proceedings of Machine Learning Research
%D 2020
%E Ryan P. Adams
%E Vibhav Gogate
%F pmlr-v115-zhao20a
%I PMLR
%P 777--787
%U https://proceedings.mlr.press/v115/zhao20a.html
%V 115
%X We consider a multitask learning problem, in which several predictors are learned jointly. Prior research has shown that learning the relations between tasks, and between the input features, together with the predictor, can lead to better generalization and interpretability, which proved to be useful for applications in many domains. In this paper, we consider a formulation of multitask learning that learns the relationships both between tasks and between features, represented through a task covariance and a feature covariance matrix, respectively. First, we demonstrate that existing methods proposed for this problem present an issue that may lead to ill-posed optimization. We then propose an alternative formulation, as well as an efficient algorithm to optimize it. Using ideas from optimization and graph theory, we propose an efficient coordinate-wise minimization algorithm that has a closed form solution for each block subproblem. Our experiments show that the proposed optimization method is orders of magnitude faster than its competitors. We also provide a nonlinear extension that is able to achieve better generalization than existing methods.
APA
Zhao, H., Stretcu, O., Smola, A. J. & Gordon, G. J. (2020). Efficient Multitask Feature and Relationship Learning. Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, in Proceedings of Machine Learning Research 115:777-787. Available from https://proceedings.mlr.press/v115/zhao20a.html.