PlaStIL: Plastic and Stable Exemplar-Free Class-Incremental Learning

Grégoire Petit, Adrian Popescu, Eden Belouadah, David Picard, Bertrand Delezoide
Proceedings of The 2nd Conference on Lifelong Learning Agents, PMLR 232:399-414, 2023.

Abstract

Plasticity and stability are both needed in class-incremental learning in order to learn from new data while preserving past knowledge. Due to catastrophic forgetting, finding a compromise between these two properties is particularly challenging when no memory buffer is available. Mainstream methods need to store two deep models, since they integrate new classes using fine-tuning with knowledge distillation from the previous incremental state. We propose a method which has a similar number of parameters but distributes them differently in order to find a better balance between plasticity and stability. Following an approach already deployed by transfer-based incremental methods, we freeze the feature extractor after the initial state. Classifiers for classes in the oldest incremental states are trained with this frozen extractor to ensure stability. Recent classes are predicted using partially fine-tuned models in order to introduce plasticity. Our proposed plasticity layer can be incorporated into any transfer-based method designed for exemplar-free incremental learning, and we apply it to two such methods. Evaluation is done with three large-scale datasets. Results show that performance gains are obtained in all tested configurations compared to existing methods.
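To make the plasticity/stability split described above concrete, the following is a minimal sketch, assuming PyTorch and a ResNet-18 backbone. The class name PlasticStableNet, the choice of fine-tuning only the last residual block (layer4), and the simple score concatenation at inference are illustrative assumptions, not the authors' exact implementation.

import copy
import torch
import torch.nn as nn
from torchvision.models import resnet18


class PlasticStableNet(nn.Module):
    def __init__(self, num_initial_classes: int):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()          # expose 512-d features
        self.frozen_extractor = backbone     # frozen after the initial state (stability)
        for p in self.frozen_extractor.parameters():
            p.requires_grad = False
        # Stable branch: a linear classifier trained on top of the frozen extractor
        # for classes of the oldest incremental states.
        self.stable_head = nn.Linear(512, num_initial_classes)
        # Plastic branch: partially fine-tuned copies for the most recent states.
        self.plastic_models = nn.ModuleList()
        self.plastic_heads = nn.ModuleList()

    def add_plastic_state(self, num_new_classes: int):
        # Copy the extractor and fine-tune only its last block on the new state's
        # classes (plasticity), keeping earlier layers frozen; the depth of
        # fine-tuning is an assumption for this sketch.
        model = copy.deepcopy(self.frozen_extractor)
        for name, p in model.named_parameters():
            p.requires_grad = name.startswith("layer4")
        self.plastic_models.append(model)
        self.plastic_heads.append(nn.Linear(512, num_new_classes))

    def forward(self, x):
        # Old classes are scored with the frozen extractor, recent classes with the
        # partially fine-tuned copies; concatenation is an assumed fusion rule.
        scores = [self.stable_head(self.frozen_extractor(x))]
        for model, head in zip(self.plastic_models, self.plastic_heads):
            scores.append(head(model(x)))
        return torch.cat(scores, dim=1)


if __name__ == "__main__":
    net = PlasticStableNet(num_initial_classes=50)
    net.add_plastic_state(num_new_classes=10)
    logits = net(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 60])

Note that this layout keeps roughly the same parameter budget as distillation-based methods that store two full networks, but spends it on per-state plastic copies instead of a teacher model.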

Cite this Paper


BibTeX
@InProceedings{pmlr-v232-petit23a,
  title     = {PlaStIL: Plastic and Stable Exemplar-Free Class-Incremental Learning},
  author    = {Petit, Gr\'egoire and Popescu, Adrian and Belouadah, Eden and Picard, David and Delezoide, Bertrand},
  booktitle = {Proceedings of The 2nd Conference on Lifelong Learning Agents},
  pages     = {399--414},
  year      = {2023},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Sedghi, Hanie and Precup, Doina},
  volume    = {232},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v232/petit23a/petit23a.pdf},
  url       = {https://proceedings.mlr.press/v232/petit23a.html},
  abstract  = {Plasticity and stability are needed in class-incremental learning in order to learn from new data while preserving past knowledge. Due to catastrophic forgetting, finding a compromise between these two properties is particularly challenging when no memory buffer is available. Mainstream methods need to store two deep models since they integrate new classes using fine tuning with knowledge distillation from the previous incremental state. We propose a method which has similar number of parameters but distributes them differently in order to find a better balance between plasticity and stability. Following an approach already deployed by transfer-based incremental methods, we freeze the feature extractor after the initial state. Classes in the oldest incremental states are trained with this frozen extractor to ensure stability. Recent classes are predicted using partially fine-tuned models in order to introduce plasticity. Our proposed plasticity layer can be incorporated to any transfer-based method designed for exemplar-free incremental learning, and we apply it to two such methods. Evaluation is done with three large-scale datasets. Results show that performance gains are obtained in all tested configurations compared to existing methods.}
}
Endnote
%0 Conference Paper
%T PlaStIL: Plastic and Stable Exemplar-Free Class-Incremental Learning
%A Grégoire Petit
%A Adrian Popescu
%A Eden Belouadah
%A David Picard
%A Bertrand Delezoide
%B Proceedings of The 2nd Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2023
%E Sarath Chandar
%E Razvan Pascanu
%E Hanie Sedghi
%E Doina Precup
%F pmlr-v232-petit23a
%I PMLR
%P 399--414
%U https://proceedings.mlr.press/v232/petit23a.html
%V 232
%X Plasticity and stability are needed in class-incremental learning in order to learn from new data while preserving past knowledge. Due to catastrophic forgetting, finding a compromise between these two properties is particularly challenging when no memory buffer is available. Mainstream methods need to store two deep models since they integrate new classes using fine tuning with knowledge distillation from the previous incremental state. We propose a method which has similar number of parameters but distributes them differently in order to find a better balance between plasticity and stability. Following an approach already deployed by transfer-based incremental methods, we freeze the feature extractor after the initial state. Classes in the oldest incremental states are trained with this frozen extractor to ensure stability. Recent classes are predicted using partially fine-tuned models in order to introduce plasticity. Our proposed plasticity layer can be incorporated to any transfer-based method designed for exemplar-free incremental learning, and we apply it to two such methods. Evaluation is done with three large-scale datasets. Results show that performance gains are obtained in all tested configurations compared to existing methods.
APA
Petit, G., Popescu, A., Belouadah, E., Picard, D., & Delezoide, B. (2023). PlaStIL: Plastic and Stable Exemplar-Free Class-Incremental Learning. Proceedings of The 2nd Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 232:399-414. Available from https://proceedings.mlr.press/v232/petit23a.html.