Confounder-Free Continual Learning via Recursive Feature Normalization

Yash Shah, Camila Gonzalez, Mohammad H. Abbasi, Qingyu Zhao, Kilian M. Pohl, Ehsan Adeli
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:54112-54142, 2025.

Abstract

Confounders are extraneous variables that affect both the input and the target, resulting in spurious correlations and biased predictions. There are recent advances in dealing with or removing confounders in traditional models, such as metadata normalization (MDN), where the distribution of the learned features is adjusted based on the study confounders. However, in the context of continual learning, where a model learns continuously from new data over time without forgetting, learning feature representations that are invariant to confounders remains a significant challenge. To remove their influence from intermediate feature representations, we introduce the Recursive MDN (R-MDN) layer, which can be integrated into any deep learning architecture, including vision transformers, and at any model stage. R-MDN performs statistical regression via the recursive least squares algorithm to maintain and continually update an internal model state with respect to changing distributions of data and confounding variables. Our experiments demonstrate that R-MDN promotes equitable predictions across population groups, both within static learning and across different stages of continual learning, by reducing catastrophic forgetting caused by confounder effects changing over time.
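
The mechanism the abstract describes, regressing intermediate feature representations on the confounders via recursive least squares (RLS) and passing forward only the residual, can be sketched in a few lines. The following is a minimal NumPy illustration of the standard RLS update applied to feature residualization; it is not the paper's R-MDN layer. The class name RecursiveResidualizer, its interface, the forgetting factor lam, and the choice to subtract the full fitted component (intercept included) are assumptions made for illustration only.

import numpy as np

class RecursiveResidualizer:
    """Sketch: remove a confounder's linear component from features
    using recursive least squares. Illustrative, not the paper's API."""

    def __init__(self, n_confounders, n_features, lam=0.99, delta=1e3):
        p = n_confounders + 1                  # +1 for an intercept term
        self.beta = np.zeros((p, n_features))  # running regression coefficients
        self.P = np.eye(p) * delta             # inverse-covariance estimate
        self.lam = lam                         # forgetting factor in (0, 1]

    def update_and_residualize(self, c, f):
        """c: (n_confounders,) confounder values; f: (n_features,) features."""
        x = np.concatenate(([1.0], c))         # prepend intercept
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)           # RLS gain vector
        err = f - x @ self.beta                # prediction error on this sample
        self.beta += np.outer(k, err)          # coefficient update
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return f - x @ self.beta               # feature residual

# Toy usage: a stream where the confounder c leaks linearly into f.
rng = np.random.default_rng(0)
rz = RecursiveResidualizer(n_confounders=1, n_features=2)
for _ in range(500):
    c = rng.normal(size=1)
    f = np.array([2.0 * c[0], -c[0]]) + rng.normal(scale=0.1, size=2)
    r = rz.update_and_residualize(c, f)
# After convergence, r is nearly uncorrelated with c.

A forgetting factor below 1 down-weights old samples, which is one standard way an RLS estimator tracks the shifting confounder distributions that continual learning introduces; the exact state update used by R-MDN and its integration into deep architectures are specified in the full paper.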

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-shah25a,
  title     = {Confounder-Free Continual Learning via Recursive Feature Normalization},
  author    = {Shah, Yash and Gonzalez, Camila and Abbasi, Mohammad H. and Zhao, Qingyu and Pohl, Kilian M. and Adeli, Ehsan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {54112--54142},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/shah25a/shah25a.pdf},
  url       = {https://proceedings.mlr.press/v267/shah25a.html},
  abstract  = {Confounders are extraneous variables that affect both the input and the target, resulting in spurious correlations and biased predictions. There are recent advances in dealing with or removing confounders in traditional models, such as metadata normalization (MDN), where the distribution of the learned features is adjusted based on the study confounders. However, in the context of continual learning, where a model learns continuously from new data over time without forgetting, learning feature representations that are invariant to confounders remains a significant challenge. To remove their influence from intermediate feature representations, we introduce the Recursive MDN (R-MDN) layer, which can be integrated into any deep learning architecture, including vision transformers, and at any model stage. R-MDN performs statistical regression via the recursive least squares algorithm to maintain and continually update an internal model state with respect to changing distributions of data and confounding variables. Our experiments demonstrate that R-MDN promotes equitable predictions across population groups, both within static learning and across different stages of continual learning, by reducing catastrophic forgetting caused by confounder effects changing over time.}
}
Endnote
%0 Conference Paper
%T Confounder-Free Continual Learning via Recursive Feature Normalization
%A Yash Shah
%A Camila Gonzalez
%A Mohammad H. Abbasi
%A Qingyu Zhao
%A Kilian M. Pohl
%A Ehsan Adeli
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-shah25a
%I PMLR
%P 54112--54142
%U https://proceedings.mlr.press/v267/shah25a.html
%V 267
%X Confounders are extraneous variables that affect both the input and the target, resulting in spurious correlations and biased predictions. There are recent advances in dealing with or removing confounders in traditional models, such as metadata normalization (MDN), where the distribution of the learned features is adjusted based on the study confounders. However, in the context of continual learning, where a model learns continuously from new data over time without forgetting, learning feature representations that are invariant to confounders remains a significant challenge. To remove their influence from intermediate feature representations, we introduce the Recursive MDN (R-MDN) layer, which can be integrated into any deep learning architecture, including vision transformers, and at any model stage. R-MDN performs statistical regression via the recursive least squares algorithm to maintain and continually update an internal model state with respect to changing distributions of data and confounding variables. Our experiments demonstrate that R-MDN promotes equitable predictions across population groups, both within static learning and across different stages of continual learning, by reducing catastrophic forgetting caused by confounder effects changing over time.
APA
Shah, Y., Gonzalez, C., Abbasi, M.H., Zhao, Q., Pohl, K.M. & Adeli, E. (2025). Confounder-Free Continual Learning via Recursive Feature Normalization. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:54112-54142. Available from https://proceedings.mlr.press/v267/shah25a.html.
