Bridging Lifelong and Multi-Task Representation Learning: An Algorithm and a Complexity Measure

Zhi Wang, Chicheng Zhang, Ramya Korlakai Vinayak
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-44, 2026.

Abstract

In lifelong learning, a learner faces a sequence of tasks with shared structure and aims to identify and leverage it to accelerate learning. We study the setting where such structure is captured by a common representation of data. Unlike multi-task learning or learning-to-learn, where tasks are available upfront to learn the representation, lifelong learning requires the learner to make use of its existing knowledge while continually gathering partial information in an *online* fashion. In this paper, we consider a generalized framework of lifelong representation learning. We propose a simple algorithm that uses multi-task empirical risk minimization as a subroutine and establish a sample complexity bound based on a new notion we introduce—the *task-eluder dimension*. Our result applies to a wide range of learning problems involving general function classes. As concrete examples, we instantiate our result on classification and regression tasks under noise.
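The multi-task empirical risk minimization subroutine mentioned in the abstract is, in its standard form, a joint optimization over a shared representation and task-specific predictors. A generic sketch in standard notation (not necessarily the paper's exact objective):

```latex
% Multi-task ERM with a shared representation \phi and per-task heads f_t:
% given M tasks with n labeled samples (x_{t,i}, y_{t,i}) each, solve
\hat{\phi},\, \hat{f}_1, \dots, \hat{f}_M
  \in \operatorname*{arg\,min}_{\phi \in \Phi,\; f_1, \dots, f_M \in \mathcal{F}}
  \frac{1}{Mn} \sum_{t=1}^{M} \sum_{i=1}^{n}
  \ell\bigl( f_t(\phi(x_{t,i})),\, y_{t,i} \bigr)
```

Here $\Phi$ is the representation class, $\mathcal{F}$ the class of task-specific functions, and $\ell$ a loss (e.g., 0-1 loss for classification or squared loss for regression, the two settings the paper instantiates).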

Cite this Paper
BibTeX
@InProceedings{pmlr-v313-wang26c,
  title     = {Bridging Lifelong and Multi-Task Representation Learning: An Algorithm and a Complexity Measure},
  author    = {Wang, Zhi and Zhang, Chicheng and Vinayak, Ramya Korlakai},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--44},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/wang26c/wang26c.pdf},
  url       = {https://proceedings.mlr.press/v313/wang26c.html},
  abstract  = {In lifelong learning, a learner faces a sequence of tasks with shared structure and aims to identify and leverage it to accelerate learning. We study the setting where such structure is captured by a common representation of data. Unlike multi-task learning or learning-to-learn, where tasks are available upfront to learn the representation, lifelong learning requires the learner to make use of its existing knowledge while continually gathering partial information in an *online* fashion. In this paper, we consider a generalized framework of lifelong representation learning. We propose a simple algorithm that uses multi-task empirical risk minimization as a subroutine and establish a sample complexity bound based on a new notion we introduce—the *task-eluder dimension*. Our result applies to a wide range of learning problems involving general function classes. As concrete examples, we instantiate our result on classification and regression tasks under noise.}
}
Endnote
%0 Conference Paper
%T Bridging Lifelong and Multi-Task Representation Learning: An Algorithm and a Complexity Measure
%A Zhi Wang
%A Chicheng Zhang
%A Ramya Korlakai Vinayak
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-wang26c
%I PMLR
%P 1--44
%U https://proceedings.mlr.press/v313/wang26c.html
%V 313
%X In lifelong learning, a learner faces a sequence of tasks with shared structure and aims to identify and leverage it to accelerate learning. We study the setting where such structure is captured by a common representation of data. Unlike multi-task learning or learning-to-learn, where tasks are available upfront to learn the representation, lifelong learning requires the learner to make use of its existing knowledge while continually gathering partial information in an *online* fashion. In this paper, we consider a generalized framework of lifelong representation learning. We propose a simple algorithm that uses multi-task empirical risk minimization as a subroutine and establish a sample complexity bound based on a new notion we introduce—the *task-eluder dimension*. Our result applies to a wide range of learning problems involving general function classes. As concrete examples, we instantiate our result on classification and regression tasks under noise.
APA
Wang, Z., Zhang, C. & Vinayak, R.K. (2026). Bridging Lifelong and Multi-Task Representation Learning: An Algorithm and a Complexity Measure. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-44. Available from https://proceedings.mlr.press/v313/wang26c.html.