Enhancing Plasticity for First Session Adaptation Continual Learning

Imad Eddine Marouf, Subhankar Roy, Stéphane Lathuilière, Enzo Tartaglione
Proceedings of The 4th Conference on Lifelong Learning Agents, PMLR 330:1-14, 2026.

Abstract

The integration of large pre-trained models (PTMs) into Class-Incremental Learning (CIL) has facilitated the development of computationally efficient strategies such as First-Session Adaptation (FSA), which fine-tunes the model solely on the first task while keeping it frozen for subsequent tasks. Although effective in homogeneous task sequences, these approaches struggle when faced with the heterogeneity of real-world task distributions. We introduce Plasticity-Enhanced Test-Time Adaptation in Class-Incremental Learning (PLASTIC), a method that reinstates plasticity in CIL while preserving model stability. PLASTIC leverages Test-Time Adaptation (TTA) by dynamically fine-tuning LayerNorm parameters on unlabeled test data, enabling adaptability to evolving tasks and improving robustness against data corruption. To prevent TTA-induced model divergence and maintain stable learning across tasks, we introduce a teacher-student distillation framework, ensuring that adaptation remains controlled and generalizable. Extensive experiments across multiple benchmarks demonstrate that PLASTIC consistently outperforms both conventional and state-of-the-art PTM-based CIL approaches, while also exhibiting inherent robustness to data corruptions. Code is available at: https://github.com/IemProg/PLASTIC
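The abstract's central mechanism, updating only LayerNorm affine parameters on unlabeled test batches while a frozen teacher copy regularizes the adapting student, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the entropy-minimization objective, the KL distillation weight `alpha`, the toy two-layer model, and all hyperparameters are assumptions made for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch (not the paper's code): test-time adaptation that
# updates ONLY LayerNorm affine parameters via entropy minimization, with a
# KL distillation term against a frozen teacher to limit model divergence.

def ln_parameters(model):
    # Yield the affine (weight/bias) parameters of every LayerNorm module.
    for m in model.modules():
        if isinstance(m, nn.LayerNorm):
            yield from m.parameters()

def tta_step(student, teacher, x, optimizer, alpha=0.5):
    # One adaptation step on an unlabeled test batch x.
    logits_s = student(x)
    with torch.no_grad():
        logits_t = teacher(x)  # teacher is frozen: no gradient flows here
    probs = logits_s.softmax(dim=-1)
    # Entropy of the student's predictions (encourages confident outputs).
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
    # KL distillation toward the frozen teacher keeps adaptation controlled.
    distill = F.kl_div(logits_s.log_softmax(dim=-1), logits_t.softmax(dim=-1),
                       reduction="batchmean")
    loss = entropy + alpha * distill
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

torch.manual_seed(0)
student = nn.Sequential(nn.Linear(16, 32), nn.LayerNorm(32),
                        nn.GELU(), nn.Linear(32, 10))
teacher = nn.Sequential(nn.Linear(16, 32), nn.LayerNorm(32),
                        nn.GELU(), nn.Linear(32, 10))
teacher.load_state_dict(student.state_dict())  # teacher = snapshot of student

# Freeze everything, then re-enable gradients for LayerNorm parameters only.
for p in teacher.parameters():
    p.requires_grad_(False)
for p in student.parameters():
    p.requires_grad_(False)
opt_params = list(ln_parameters(student))
for p in opt_params:
    p.requires_grad_(True)

opt = torch.optim.SGD(opt_params, lr=0.1)
losses = [tta_step(student, teacher, torch.randn(8, 16), opt) for _ in range(3)]
```

The design point the abstract emphasizes is the split of responsibilities: the tiny LayerNorm parameter set supplies plasticity under distribution shift, while the distillation term anchors the student to the frozen teacher for stability.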

Cite this Paper


BibTeX
@InProceedings{pmlr-v330-marouf26a,
  title     = {Enhancing Plasticity for First Session Adaptation Continual Learning},
  author    = {Marouf, Imad Eddine and Roy, Subhankar and Lathuili\`{e}re, St\'{e}phane and Tartaglione, Enzo},
  booktitle = {Proceedings of The 4th Conference on Lifelong Learning Agents},
  pages     = {1--14},
  year      = {2026},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Eaton, Eric and Liu, Bing and Mahmood, Rupam and Rannen-Triki, Amal},
  volume    = {330},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Aug},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v330/main/assets/marouf26a/marouf26a.pdf},
  url       = {https://proceedings.mlr.press/v330/marouf26a.html},
  abstract  = {The integration of large pre-trained models (PTMs) into Class-Incremental Learning (CIL) has facilitated the development of computationally efficient strategies such as First-Session Adaptation (FSA), which fine-tunes the model solely on the first task while keeping it frozen for subsequent tasks. Although effective in homogeneous task sequences, these approaches struggle when faced with the heterogeneity of real-world task distributions. We introduce Plasticity-Enhanced Test-Time Adaptation in Class-Incremental Learning (PLASTIC), a method that reinstates plasticity in CIL while preserving model stability. PLASTIC leverages Test-Time Adaptation (TTA) by dynamically fine-tuning LayerNorm parameters on unlabeled test data, enabling adaptability to evolving tasks and improving robustness against data corruption. To prevent TTA-induced model divergence and maintain stable learning across tasks, we introduce a teacher-student distillation framework, ensuring that adaptation remains controlled and generalizable. Extensive experiments across multiple benchmarks demonstrate that PLASTIC consistently outperforms both conventional and state-of-the-art PTM-based CIL approaches, while also exhibiting inherent robustness to data corruptions. Code is available at: https://github.com/IemProg/PLASTIC}
}
Endnote
%0 Conference Paper
%T Enhancing Plasticity for First Session Adaptation Continual Learning
%A Imad Eddine Marouf
%A Subhankar Roy
%A Stéphane Lathuilière
%A Enzo Tartaglione
%B Proceedings of The 4th Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2026
%E Sarath Chandar
%E Razvan Pascanu
%E Eric Eaton
%E Bing Liu
%E Rupam Mahmood
%E Amal Rannen-Triki
%F pmlr-v330-marouf26a
%I PMLR
%P 1--14
%U https://proceedings.mlr.press/v330/marouf26a.html
%V 330
%X The integration of large pre-trained models (PTMs) into Class-Incremental Learning (CIL) has facilitated the development of computationally efficient strategies such as First-Session Adaptation (FSA), which fine-tunes the model solely on the first task while keeping it frozen for subsequent tasks. Although effective in homogeneous task sequences, these approaches struggle when faced with the heterogeneity of real-world task distributions. We introduce Plasticity-Enhanced Test-Time Adaptation in Class-Incremental Learning (PLASTIC), a method that reinstates plasticity in CIL while preserving model stability. PLASTIC leverages Test-Time Adaptation (TTA) by dynamically fine-tuning LayerNorm parameters on unlabeled test data, enabling adaptability to evolving tasks and improving robustness against data corruption. To prevent TTA-induced model divergence and maintain stable learning across tasks, we introduce a teacher-student distillation framework, ensuring that adaptation remains controlled and generalizable. Extensive experiments across multiple benchmarks demonstrate that PLASTIC consistently outperforms both conventional and state-of-the-art PTM-based CIL approaches, while also exhibiting inherent robustness to data corruptions. Code is available at: https://github.com/IemProg/PLASTIC
APA
Marouf, I.E., Roy, S., Lathuilière, S. & Tartaglione, E. (2026). Enhancing Plasticity for First Session Adaptation Continual Learning. Proceedings of The 4th Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 330:1-14. Available from https://proceedings.mlr.press/v330/marouf26a.html.
