Distributed Multi-Task Learning

Jialei Wang, Mladen Kolar, Nathan Srebro
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:751-760, 2016.

Abstract

We consider the problem of distributed multi-task learning, where each machine learns a separate, but related, task. Specifically, each machine learns a linear predictor in high-dimensional space, where all tasks share the same small support. We present a communication-efficient estimator based on the debiased lasso and show that it is comparable with the optimal centralized method.
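To make the abstract's setup concrete, here is a minimal Python sketch of one plausible two-round scheme of the kind described: each machine fits a lasso to its own task, applies a simple one-step debiasing correction (using a diagonal approximation to the precision matrix, which is cruder than the paper's construction), and sends the dense debiased vector to a center, which recovers the shared support by group hard-thresholding the stacked estimates. The function names, the thresholding aggregation, and the diagonal debiasing matrix are illustrative assumptions, not the paper's exact procedure.

import numpy as np
from sklearn.linear_model import Lasso

def debiased_lasso(X, y, lam):
    """One machine: lasso fit plus a one-step debiasing correction.

    Illustrative only: the precision matrix is approximated by
    diag(1 / Sigma_hat_jj); the paper's construction is more careful.
    """
    n, _ = X.shape
    beta = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_
    sigma_diag = np.mean(X ** 2, axis=0)        # diagonal of the empirical covariance
    residual = y - X @ beta
    return beta + (X.T @ residual) / (n * sigma_diag)

def distributed_support_recovery(tasks, lam, s):
    """Center: stack the debiased estimates (one message per machine),
    keep the s features with the largest across-task l2 norm (the
    assumed shared support), and zero out the rest."""
    B = np.column_stack([debiased_lasso(X, y, lam) for X, y in tasks])
    group_norms = np.linalg.norm(B, axis=1)     # per-feature norm across tasks
    support = np.sort(np.argsort(group_norms)[-s:])
    B_hat = np.zeros_like(B)
    B_hat[support, :] = B[support, :]
    return B_hat, support

# Toy demo: m tasks sharing the same 5-sparse support in p = 200 dimensions.
rng = np.random.default_rng(0)
m, n, p, s = 8, 100, 200, 5
support_true = rng.choice(p, size=s, replace=False)
tasks = []
for _ in range(m):
    beta = np.zeros(p)
    beta[support_true] = rng.normal(0, 1, size=s)   # related but distinct tasks
    X = rng.normal(size=(n, p))
    y = X @ beta + 0.1 * rng.normal(size=n)
    tasks.append((X, y))

_, support_est = distributed_support_recovery(tasks, lam=0.05, s=s)
print("true support:     ", np.sort(support_true))
print("estimated support:", support_est)

Each machine communicates a single p-dimensional vector, which is what makes a scheme like this communication-efficient relative to shipping raw data to the center.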

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-wang16d,
  title     = {Distributed Multi-Task Learning},
  author    = {Wang, Jialei and Kolar, Mladen and Srebro, Nathan},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {751--760},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/wang16d.pdf},
  url       = {https://proceedings.mlr.press/v51/wang16d.html},
  abstract  = {We consider the problem of distributed multi-task learning, where each machine learns a separate, but related, task. Specifically, each machine learns a linear predictor in high-dimensional space, where all tasks share the same small support. We present a communication-efficient estimator based on the debiased lasso and show that it is comparable with the optimal centralized method.}
}
EndNote
%0 Conference Paper
%T Distributed Multi-Task Learning
%A Jialei Wang
%A Mladen Kolar
%A Nathan Srebro
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-wang16d
%I PMLR
%P 751--760
%U https://proceedings.mlr.press/v51/wang16d.html
%V 51
%X We consider the problem of distributed multi-task learning, where each machine learns a separate, but related, task. Specifically, each machine learns a linear predictor in high-dimensional space, where all tasks share the same small support. We present a communication-efficient estimator based on the debiased lasso and show that it is comparable with the optimal centralized method.
RIS
TY  - CPAPER
TI  - Distributed Multi-Task Learning
AU  - Jialei Wang
AU  - Mladen Kolar
AU  - Nathan Srebro
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-wang16d
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 751
EP  - 760
L1  - http://proceedings.mlr.press/v51/wang16d.pdf
UR  - https://proceedings.mlr.press/v51/wang16d.html
AB  - We consider the problem of distributed multi-task learning, where each machine learns a separate, but related, task. Specifically, each machine learns a linear predictor in high-dimensional space, where all tasks share the same small support. We present a communication-efficient estimator based on the debiased lasso and show that it is comparable with the optimal centralized method.
ER  -
APA
Wang, J., Kolar, M., & Srebro, N. (2016). Distributed Multi-Task Learning. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:751-760. Available from https://proceedings.mlr.press/v51/wang16d.html.

Related Material

Download PDF: http://proceedings.mlr.press/v51/wang16d.pdf