BOWL: A Deceptively Simple Open World Learner

Roshni Ramanna Kamath, Rupert Mitchell, Subarnaduti Paul, Kristian Kersting, Martin Mundt
Proceedings of The 4th Conference on Lifelong Learning Agents, PMLR 330:358-375, 2026.

Abstract

Traditional machine learning excels on static benchmarks, but the real world is dynamic and seldom as carefully curated as test sets. Practical applications generally encounter undesired inputs, are required to deal with novel information, and need to ensure operation throughout their full lifetime: aspects where standard deep models struggle. Each of these three elements has been researched individually, but their practical conjunction, i.e., open world learning, is much less consolidated. In this paper, we posit that neural networks already contain a powerful catalyst to turn them into open world learners: the batch normalization layer. Leveraging its tracked statistics, we derive effective strategies to detect in- and out-of-distribution samples, select informative data points, and update the model continuously. This, in turn, allows us to demonstrate that existing batch-normalized models can be made more robust and less prone to forgetting over time, and can be trained efficiently with less data.
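
To make the core idea concrete, the following is a minimal illustrative sketch, not the paper's actual method, of how the running statistics tracked by batch normalization layers could yield an out-of-distribution score: the per-channel mean and variance of a test batch are compared against each layer's running estimates, and large deviations flag the batch as out of distribution. The function name bn_ood_score and the specific deviation measure are assumptions made for illustration.

import torch
import torch.nn as nn

def bn_ood_score(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    # Hypothetical sketch: compare per-channel batch statistics against the
    # running statistics tracked by every BatchNorm2d layer in the model.
    # Larger scores indicate a larger departure from the training distribution.
    scores, hooks = [], []

    def hook(module, inputs, output):
        h = inputs[0]                          # activations entering this BN layer
        mu = h.mean(dim=(0, 2, 3))             # per-channel batch mean
        var = h.var(dim=(0, 2, 3))             # per-channel batch variance
        rv = module.running_var + module.eps
        # Variance-normalized deviation of batch stats from the tracked stats
        d = (mu - module.running_mean) ** 2 / rv + (var / rv - 1.0) ** 2
        scores.append(d.mean())

    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            hooks.append(m.register_forward_hook(hook))

    model.eval()                               # eval mode: running stats stay frozen
    with torch.no_grad():
        model(x)

    for h in hooks:
        h.remove()
    return torch.stack(scores).mean()          # aggregate over all BN layers

Under this sketch, a batch drawn from the training distribution should score near zero, whereas corrupted or novel data should score higher; thresholding the score gives a simple in/out-of-distribution detector of the kind the abstract describes.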

Cite this Paper

BibTeX
@InProceedings{pmlr-v330-kamath26a,
  title     = {BOWL: A Deceptively Simple Open World Learner},
  author    = {Kamath, Roshni Ramanna and Mitchell, Rupert and Paul, Subarnaduti and Kersting, Kristian and Mundt, Martin},
  booktitle = {Proceedings of The 4th Conference on Lifelong Learning Agents},
  pages     = {358--375},
  year      = {2026},
  editor    = {Chandar, Sarath and Pascanu, Razvan and Eaton, Eric and Liu, Bing and Mahmood, Rupam and Rannen-Triki, Amal},
  volume    = {330},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Aug},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v330/main/assets/kamath26a/kamath26a.pdf},
  url       = {https://proceedings.mlr.press/v330/kamath26a.html}
}
Endnote
%0 Conference Paper
%T BOWL: A Deceptively Simple Open World Learner
%A Roshni Ramanna Kamath
%A Rupert Mitchell
%A Subarnaduti Paul
%A Kristian Kersting
%A Martin Mundt
%B Proceedings of The 4th Conference on Lifelong Learning Agents
%C Proceedings of Machine Learning Research
%D 2026
%E Sarath Chandar
%E Razvan Pascanu
%E Eric Eaton
%E Bing Liu
%E Rupam Mahmood
%E Amal Rannen-Triki
%F pmlr-v330-kamath26a
%I PMLR
%P 358--375
%U https://proceedings.mlr.press/v330/kamath26a.html
%V 330
APA
Kamath, R.R., Mitchell, R., Paul, S., Kersting, K. & Mundt, M. (2026). BOWL: A Deceptively Simple Open World Learner. Proceedings of The 4th Conference on Lifelong Learning Agents, in Proceedings of Machine Learning Research 330:358-375. Available from https://proceedings.mlr.press/v330/kamath26a.html.
