Multitask Metric Learning: Theory and Algorithm

Boyu Wang, Hejia Zhang, Peng Liu, Zebang Shen, Joelle Pineau
Proceedings of Machine Learning Research, PMLR 89:3362-3371, 2019.

Abstract

In this paper, we study the problem of multitask metric learning (mtML). We first examine the generalization bound of the regularized mtML formulation based on the notion of algorithmic stability, proving the convergence rate of mtML and revealing the trade-off between the tasks. Moreover, we establish the theoretical connection between mtML, single-task learning, and pooling-task learning approaches. In addition, we present a novel boosting-based mtML (mt-BML) algorithm, which scales well with the feature dimension of the data. Finally, we devise an efficient second-order Riemannian retraction operator tailored specifically to our mt-BML algorithm. It produces a low-rank solution of mtML to reduce the model complexity, and may also improve generalization performance. Extensive evaluations on several benchmark data sets verify the effectiveness of our learning algorithm.

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-wang19f,
  title = {Multitask Metric Learning: Theory and Algorithm},
  author = {Wang, Boyu and Zhang, Hejia and Liu, Peng and Shen, Zebang and Pineau, Joelle},
  booktitle = {Proceedings of Machine Learning Research},
  pages = {3362--3371},
  year = {2019},
  editor = {Kamalika Chaudhuri and Masashi Sugiyama},
  volume = {89},
  series = {Proceedings of Machine Learning Research},
  month = {16--18 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v89/wang19f/wang19f.pdf},
  url = {http://proceedings.mlr.press/v89/wang19f.html},
  abstract = {In this paper, we study the problem of multitask metric learning (mtML). We first examine the generalization bound of the regularized mtML formulation based on the notion of algorithmic stability, proving the convergence rate of mtML and revealing the trade-off between the tasks. Moreover, we establish the theoretical connection between mtML, single-task learning, and pooling-task learning approaches. In addition, we present a novel boosting-based mtML (mt-BML) algorithm, which scales well with the feature dimension of the data. Finally, we devise an efficient second-order Riemannian retraction operator tailored specifically to our mt-BML algorithm. It produces a low-rank solution of mtML to reduce the model complexity, and may also improve generalization performance. Extensive evaluations on several benchmark data sets verify the effectiveness of our learning algorithm.}
}
Endnote
%0 Conference Paper
%T Multitask Metric Learning: Theory and Algorithm
%A Boyu Wang
%A Hejia Zhang
%A Peng Liu
%A Zebang Shen
%A Joelle Pineau
%B Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-wang19f
%I PMLR
%P 3362--3371
%U http://proceedings.mlr.press
%V 89
%W PMLR
%X In this paper, we study the problem of multitask metric learning (mtML). We first examine the generalization bound of the regularized mtML formulation based on the notion of algorithmic stability, proving the convergence rate of mtML and revealing the trade-off between the tasks. Moreover, we establish the theoretical connection between mtML, single-task learning, and pooling-task learning approaches. In addition, we present a novel boosting-based mtML (mt-BML) algorithm, which scales well with the feature dimension of the data. Finally, we devise an efficient second-order Riemannian retraction operator tailored specifically to our mt-BML algorithm. It produces a low-rank solution of mtML to reduce the model complexity, and may also improve generalization performance. Extensive evaluations on several benchmark data sets verify the effectiveness of our learning algorithm.
APA
Wang, B., Zhang, H., Liu, P., Shen, Z. & Pineau, J. (2019). Multitask Metric Learning: Theory and Algorithm. Proceedings of Machine Learning Research, in PMLR 89:3362-3371.