Spurious Features in Continual Learning

Timothée Lesort
Proceedings of The First AAAI Bridge Program on Continual Causality, PMLR 208:59-62, 2023.

Abstract

Continual Learning (CL) is the research field addressing training settings where the data distribution is not static. One of the core problems CL addresses is learning without forgetting. To solve these problems, continual learning algorithms need to learn robust and stable representations based only on a subset of the data. Those representations are necessarily biased and should be revisited when new data becomes available. This paper studies the influence of spurious features on continual learning algorithms. We show that in continual learning, algorithms have to deal with local spurious features that correlate well with labels within a single task but are not good representations of the concept to learn. One of the big challenges for continual learning algorithms is to discover causal relationships between features and labels under distribution shifts.
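The abstract's notion of a "local" spurious feature can be illustrated with a small synthetic sketch (not from the paper; the task construction, feature names, and threshold rule below are illustrative assumptions): a feature that correlates perfectly with the labels within one task, but whose correlation flips in the next task, so a model that latched onto it fails after the distribution shift.

```python
import random

random.seed(0)

def make_task(n, color_for_class0):
    """Synthetic task with two features per sample:
    'shape' is the causal feature (always tracks the label),
    'color' is spurious: within a task it correlates perfectly
    with the label, but the mapping flips between tasks."""
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        shape = y + 0.1 * random.gauss(0, 1)               # causal feature
        color = color_for_class0 if y == 0 else 1 - color_for_class0
        data.append(((shape, color), y))
    return data

task1 = make_task(1000, color_for_class0=0)  # task 1: class 0 has color 0
task2 = make_task(1000, color_for_class0=1)  # task 2: the correlation flips

# A classifier that latched onto the spurious 'color' feature during task 1:
def predict(x):
    return 1 if x[1] > 0.5 else 0

def accuracy(data):
    return sum(predict(x) == y for x, y in data) / len(data)

print(accuracy(task1))  # 1.0 -- perfect within task 1
print(accuracy(task2))  # 0.0 -- the local correlation no longer holds
```

A classifier using the causal `shape` feature instead (e.g. thresholding at 0.5) would remain accurate on both tasks, which is the distinction between a locally spurious and a causal feature that the abstract describes.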

Cite this Paper


BibTeX
@InProceedings{pmlr-v208-lesort23a,
  title     = {Spurious Features in Continual Learning},
  author    = {Lesort, Timoth{\'e}e},
  booktitle = {Proceedings of The First AAAI Bridge Program on Continual Causality},
  pages     = {59--62},
  year      = {2023},
  editor    = {Mundt, Martin and Cooper, Keiland W. and Dhami, Devendra Singh and Ribeiro, Adéle and Smith, James Seale and Bellot, Alexis and Hayes, Tyler},
  volume    = {208},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--08 Feb},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v208/lesort23a/lesort23a.pdf},
  url       = {https://proceedings.mlr.press/v208/lesort23a.html},
  abstract  = {Continual Learning (CL) is the research field addressing training settings where the data distribution is not static. One of the core problems CL addresses is learning without forgetting. To solve problems, continual learning algorithms need to learn robust and stable representations based only on a subset of the data. Those representations are necessarily biased and should be revisited when new data becomes available. This paper studies spurious features’ influence on continual learning algorithms. We show that in continual learning, algorithms have to deal with local spurious features that correlate well with labels within a task only but which are not good representations for the concept to learn. One of the big challenges of continual learning algorithms is to discover causal relationships between features and labels under distribution shifts.}
}
Endnote
%0 Conference Paper
%T Spurious Features in Continual Learning
%A Timothée Lesort
%B Proceedings of The First AAAI Bridge Program on Continual Causality
%C Proceedings of Machine Learning Research
%D 2023
%E Martin Mundt
%E Keiland W. Cooper
%E Devendra Singh Dhami
%E Adéle Ribeiro
%E James Seale Smith
%E Alexis Bellot
%E Tyler Hayes
%F pmlr-v208-lesort23a
%I PMLR
%P 59--62
%U https://proceedings.mlr.press/v208/lesort23a.html
%V 208
%X Continual Learning (CL) is the research field addressing training settings where the data distribution is not static. One of the core problems CL addresses is learning without forgetting. To solve problems, continual learning algorithms need to learn robust and stable representations based only on a subset of the data. Those representations are necessarily biased and should be revisited when new data becomes available. This paper studies spurious features’ influence on continual learning algorithms. We show that in continual learning, algorithms have to deal with local spurious features that correlate well with labels within a task only but which are not good representations for the concept to learn. One of the big challenges of continual learning algorithms is to discover causal relationships between features and labels under distribution shifts.
APA
Lesort, T. (2023). Spurious Features in Continual Learning. Proceedings of The First AAAI Bridge Program on Continual Causality, in Proceedings of Machine Learning Research 208:59-62. Available from https://proceedings.mlr.press/v208/lesort23a.html.