A PAC-Bayesian bound for Lifelong Learning

Anastasia Pentina, Christoph Lampert
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):991-999, 2014.

Abstract

Transfer learning has received a lot of attention in the machine learning community over the last years, and several effective algorithms have been developed. However, relatively little is known about their theoretical properties, especially in the setting of lifelong learning, where the goal is to transfer information to tasks for which no data have been observed so far. In this work we study lifelong learning from a theoretical perspective. Our main result is a PAC-Bayesian generalization bound that offers a unified view on existing paradigms for transfer learning, such as the transfer of parameters or the transfer of low-dimensional representations. We also use the bound to derive two principled lifelong learning algorithms, and we show that these yield results comparable with existing methods.
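As background for readers less familiar with the framework, the sketch below recalls one standard form of the classical single-task PAC-Bayesian bound (in the McAllester/Maurer style) that results of this kind build on; it is included only as orientation and is not the lifelong-learning bound proved in the paper. For an i.i.d. sample of size m, any prior P over hypotheses fixed before seeing the data, and any confidence parameter δ in (0,1), with probability at least 1 − δ the following holds simultaneously for all posteriors Q:

$$
\mathrm{er}(Q) \;\le\; \widehat{\mathrm{er}}(Q) \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}}
$$

Here er(Q) and êr(Q) denote the expected and empirical loss of the Gibbs predictor drawn from Q, and KL(Q‖P) is the Kullback-Leibler divergence between posterior and prior. The paper's contribution, as stated in the abstract, is a guarantee of this flavor for the lifelong setting, where the quantity being controlled is performance on future tasks for which no data have been observed yet.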

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-pentina14,
  title     = {A PAC-Bayesian bound for Lifelong Learning},
  author    = {Anastasia Pentina and Christoph Lampert},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {991--999},
  year      = {2014},
  editor    = {Eric P. Xing and Tony Jebara},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/pentina14.pdf},
  url       = {http://proceedings.mlr.press/v32/pentina14.html},
  abstract  = {Transfer learning has received a lot of attention in the machine learning community over the last years, and several effective algorithms have been developed. However, relatively little is known about their theoretical properties, especially in the setting of lifelong learning, where the goal is to transfer information to tasks for which no data have been observed so far. In this work we study lifelong learning from a theoretical perspective. Our main result is a PAC-Bayesian generalization bound that offers a unified view on existing paradigms for transfer learning, such as the transfer of parameters or the transfer of low-dimensional representations. We also use the bound to derive two principled lifelong learning algorithms, and we show that these yield results comparable with existing methods.}
}
Endnote
%0 Conference Paper
%T A PAC-Bayesian bound for Lifelong Learning
%A Anastasia Pentina
%A Christoph Lampert
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-pentina14
%I PMLR
%J Proceedings of Machine Learning Research
%P 991--999
%U http://proceedings.mlr.press
%V 32
%N 2
%W PMLR
%X Transfer learning has received a lot of attention in the machine learning community over the last years, and several effective algorithms have been developed. However, relatively little is known about their theoretical properties, especially in the setting of lifelong learning, where the goal is to transfer information to tasks for which no data have been observed so far. In this work we study lifelong learning from a theoretical perspective. Our main result is a PAC-Bayesian generalization bound that offers a unified view on existing paradigms for transfer learning, such as the transfer of parameters or the transfer of low-dimensional representations. We also use the bound to derive two principled lifelong learning algorithms, and we show that these yield results comparable with existing methods.
RIS
TY - CPAPER
TI - A PAC-Bayesian bound for Lifelong Learning
AU - Anastasia Pentina
AU - Christoph Lampert
BT - Proceedings of the 31st International Conference on Machine Learning
PY - 2014/01/27
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-pentina14
PB - PMLR
SP - 991
DP - PMLR
EP - 999
L1 - http://proceedings.mlr.press/v32/pentina14.pdf
UR - http://proceedings.mlr.press/v32/pentina14.html
AB - Transfer learning has received a lot of attention in the machine learning community over the last years, and several effective algorithms have been developed. However, relatively little is known about their theoretical properties, especially in the setting of lifelong learning, where the goal is to transfer information to tasks for which no data have been observed so far. In this work we study lifelong learning from a theoretical perspective. Our main result is a PAC-Bayesian generalization bound that offers a unified view on existing paradigms for transfer learning, such as the transfer of parameters or the transfer of low-dimensional representations. We also use the bound to derive two principled lifelong learning algorithms, and we show that these yield results comparable with existing methods.
ER -
APA
Pentina, A. & Lampert, C. (2014). A PAC-Bayesian bound for Lifelong Learning. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(2):991-999.
