Sparse coding for multitask and transfer learning

Andreas Maurer, Massi Pontil, Bernardino Romera-Paredes
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):343-351, 2013.

Abstract

We investigate the use of sparse coding and dictionary learning in the context of multitask and transfer learning. The central assumption of our learning method is that the task parameters are well approximated by sparse linear combinations of the atoms of a dictionary in a high- or infinite-dimensional space. This assumption, together with the large quantity of data available in the multitask and transfer learning settings, allows a principled choice of the dictionary. We provide bounds on the generalization error of this approach for both settings. Numerical experiments on one synthetic and two real datasets show the advantage of our method over single-task learning, over a previous method based on orthogonal and dense representations of the tasks, and over a related method that learns task groupings.
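In symbols, the central assumption is that each task's weight vector satisfies w_t ≈ D c_t, where D is a dictionary shared across all tasks and c_t is a sparse code for task t. As a rough illustration only, the Python sketch below fits such a model to a collection of regression tasks; the squared loss, the alternating Lasso/projected-gradient solver, and all function and parameter names are assumptions made for this sketch, not the authors' algorithm.

import numpy as np
from sklearn.linear_model import Lasso  # l1-penalised least-squares solver

def multitask_sparse_coding(Xs, ys, K=10, alpha=0.1, n_iter=50, lr=0.1, seed=0):
    """Hypothetical sketch: model each task's weights as w_t = D @ c_t with a
    shared dictionary D (atoms kept in the unit ball) and sparse codes c_t.
    Alternates per-task Lasso coding with a projected gradient step on D,
    minimising the average squared loss over all T tasks."""
    rng = np.random.default_rng(seed)
    d, T = Xs[0].shape[1], len(Xs)
    D = rng.standard_normal((d, K))
    D /= np.linalg.norm(D, axis=0)                       # unit-norm atoms
    C = np.zeros((K, T))
    for _ in range(n_iter):
        # Code step: with D fixed, each task is an ordinary Lasso problem
        # in the K-dimensional "atom features" X_t @ D.
        for t in range(T):
            Z = Xs[t] @ D
            C[:, t] = Lasso(alpha=alpha, fit_intercept=False,
                            max_iter=5000).fit(Z, ys[t]).coef_
        # Dictionary step: gradient of the averaged squared loss w.r.t. D,
        # then projection of each atom back into the unit ball.
        G = np.zeros_like(D)
        for t in range(T):
            r = Xs[t] @ (D @ C[:, t]) - ys[t]            # residuals of task t
            G += np.outer(Xs[t].T @ r, C[:, t]) / (len(ys[t]) * T)
        D -= lr * G
        D /= np.maximum(np.linalg.norm(D, axis=0), 1.0)
    return D, C

# Toy usage: tasks whose true parameters are sparse combinations of 5 atoms.
rng = np.random.default_rng(1)
true_D = rng.standard_normal((20, 5))
Xs = [rng.standard_normal((30, 20)) for _ in range(15)]
ys = [X @ (true_D @ rng.choice([0.0, 0.0, 1.0], size=5)) for X in Xs]
D, C = multitask_sparse_coding(Xs, ys, K=5)

In the transfer setting one would keep the learned D fixed and solve only the per-task Lasso step for a new task, which is the sense in which the learned dictionary could transfer.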

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-maurer13,
  title     = {Sparse coding for multitask and transfer learning},
  author    = {Andreas Maurer and Massi Pontil and Bernardino Romera-Paredes},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {343--351},
  year      = {2013},
  editor    = {Sanjoy Dasgupta and David McAllester},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/maurer13.pdf},
  url       = {http://proceedings.mlr.press/v28/maurer13.html}
}
APA
Maurer, A., Pontil, M. & Romera-Paredes, B. (2013). Sparse coding for multitask and transfer learning. Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):343-351.
