Hidden in Plain Sight: Subgroup Shifts Escape OOD Detection

Lisa M Koch, Christian M Schürch, Arthur Gretton, Philipp Berens
Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, PMLR 172:726-740, 2022.

Abstract

The safe application of machine learning systems in healthcare relies on valid performance claims. Such claims are typically established in a clinical validation setting designed to be as close as possible to the intended use, but inadvertent domain or population shifts remain a fundamental problem. In particular, subgroups may be represented differently in the validation data distribution than in the application setting. For example, algorithms trained on population cohort data spanning all age groups may be predominantly applied to elderly people. While these data are not “out-of-distribution”, changes in the prevalence of different subgroups may have a considerable impact on algorithm performance, or will at least render the original performance claims invalid. Both are serious problems for safely deploying machine learning systems. In this paper, we demonstrate the fundamental limitations of individual-example out-of-distribution detection for such scenarios, and show that subgroup shifts can instead be detected at the population level. We formulate population-level shift detection in the framework of statistical hypothesis testing and show that recent state-of-the-art statistical tests can be effectively applied to subgroup shift detection in a synthetic scenario as well as on real histopathology images.
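To make the population-level testing idea concrete, here is a minimal Python sketch (illustrative only, not the authors' implementation; function names and parameters are assumptions). It applies a kernel two-sample test, a plain RBF-kernel MMD with a permutation-based p-value, to a synthetic subgroup shift: two in-distribution subgroups whose mixture weights differ between a validation and a deployment cohort. Every deployment sample is individually in-distribution, so per-example OOD detection has nothing to flag, yet the population-level test detects the changed subgroup prevalence.

import numpy as np

rng = np.random.default_rng(0)

def sample_population(n, p_subgroup_a, rng):
    """Draw n points from a mixture of two in-distribution subgroups."""
    in_a = rng.random(n) < p_subgroup_a
    means = np.where(in_a, -1.0, 1.0)  # subgroup A centred at -1, B at +1
    return (means + 0.5 * rng.standard_normal(n)).reshape(-1, 1)

def mmd_permutation_test(X, Y, n_perm=500, rng=None):
    """Kernel two-sample test: permutation p-value for the biased MMD^2."""
    rng = rng if rng is not None else np.random.default_rng()
    Z = np.vstack([X, Y])
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    bandwidth = np.sqrt(np.median(sq[sq > 0]) / 2)  # median heuristic
    K = np.exp(-sq / (2 * bandwidth ** 2))          # RBF kernel matrix
    n = len(X)

    def mmd2(idx):
        a, b = idx[:n], idx[n:]
        return (K[np.ix_(a, a)].mean() + K[np.ix_(b, b)].mean()
                - 2 * K[np.ix_(a, b)].mean())

    observed = mmd2(np.arange(len(Z)))
    null = [mmd2(rng.permutation(len(Z))) for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (1 + n_perm)

# Validation cohort: 50/50 subgroup mix; deployment cohort: 90/10 mix.
X_val = sample_population(200, 0.5, rng)
X_dep = sample_population(200, 0.9, rng)
print("p-value:", mmd_permutation_test(X_val, X_dep, rng=rng))
# A small p-value flags the population-level subgroup shift even though
# every deployment sample, taken alone, looks in-distribution.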

Cite this Paper


BibTeX
@InProceedings{pmlr-v172-koch22a,
  title     = {Hidden in Plain Sight: Subgroup Shifts Escape OOD Detection},
  author    = {Koch, Lisa M and Sch{\"u}rch, Christian M and Gretton, Arthur and Berens, Philipp},
  booktitle = {Proceedings of The 5th International Conference on Medical Imaging with Deep Learning},
  pages     = {726--740},
  year      = {2022},
  editor    = {Konukoglu, Ender and Menze, Bjoern and Venkataraman, Archana and Baumgartner, Christian and Dou, Qi and Albarqouni, Shadi},
  volume    = {172},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v172/koch22a/koch22a.pdf},
  url       = {https://proceedings.mlr.press/v172/koch22a.html}
}
Endnote
%0 Conference Paper
%T Hidden in Plain Sight: Subgroup Shifts Escape OOD Detection
%A Lisa M Koch
%A Christian M Schürch
%A Arthur Gretton
%A Philipp Berens
%B Proceedings of The 5th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Ender Konukoglu
%E Bjoern Menze
%E Archana Venkataraman
%E Christian Baumgartner
%E Qi Dou
%E Shadi Albarqouni
%F pmlr-v172-koch22a
%I PMLR
%P 726--740
%U https://proceedings.mlr.press/v172/koch22a.html
%V 172
APA
Koch, L.M., Schürch, C.M., Gretton, A. & Berens, P. (2022). Hidden in Plain Sight: Subgroup Shifts Escape OOD Detection. Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 172:726-740. Available from https://proceedings.mlr.press/v172/koch22a.html.
