Training OOD Detectors in their Natural Habitats

Julian Katz-Samuels, Julia B Nakhleh, Robert Nowak, Yixuan Li
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:10848-10865, 2022.

Abstract

Out-of-distribution (OOD) detection is important for machine learning models deployed in the wild. Recent methods use auxiliary outlier data to regularize the model for improved OOD detection. However, these approaches make a strong distributional assumption that the auxiliary outlier data is completely separable from the in-distribution (ID) data. In this paper, we propose a novel framework that leverages wild mixture data, which naturally consists of both ID and OOD samples. Such wild data is abundant and arises freely upon deploying a machine learning classifier in its natural habitat. Our key idea is to formulate a constrained optimization problem and to show how to tractably solve it. Our learning objective maximizes the OOD detection rate, subject to constraints on the classification error of ID data and on the OOD error rate of ID examples. We extensively evaluate our approach on common OOD detection tasks and demonstrate superior performance. Code is available at https://github.com/jkatzsam/woods_ood.
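The learning objective described in the abstract can be sketched as a constrained optimization problem. The notation below (detector g, classifier f, tolerances alpha and tau) is illustrative shorthand for the structure the abstract describes, not the paper's exact formulation:

```latex
% Hypothetical notation: f is the ID classifier, g the OOD detector;
% alpha bounds the ID classification error, tau the ID false-positive rate.
\begin{align*}
\max_{f,\,g} \quad & \text{detection rate of OOD samples in the wild mixture data under } g \\
\text{s.t.} \quad  & \text{classification error of } f \text{ on ID data} \;\le\; \alpha, \\
                   & \text{fraction of ID examples flagged as OOD by } g \;\le\; \tau.
\end{align*}
```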

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-katz-samuels22a,
  title     = {Training {OOD} Detectors in their Natural Habitats},
  author    = {Katz-Samuels, Julian and Nakhleh, Julia B and Nowak, Robert and Li, Yixuan},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {10848--10865},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/katz-samuels22a/katz-samuels22a.pdf},
  url       = {https://proceedings.mlr.press/v162/katz-samuels22a.html},
  abstract  = {Out-of-distribution (OOD) detection is important for machine learning models deployed in the wild. Recent methods use auxiliary outlier data to regularize the model for improved OOD detection. However, these approaches make a strong distributional assumption that the auxiliary outlier data is completely separable from the in-distribution (ID) data. In this paper, we propose a novel framework that leverages wild mixture data—that naturally consists of both ID and OOD samples. Such wild data is abundant and arises freely upon deploying a machine learning classifier in their natural habitats. Our key idea is to formulate a constrained optimization problem and to show how to tractably solve it. Our learning objective maximizes the OOD detection rate, subject to constraints on the classification error of ID data and on the OOD error rate of ID examples. We extensively evaluate our approach on common OOD detection tasks and demonstrate superior performance. Code is available at https://github.com/jkatzsam/woods_ood.}
}
Endnote
%0 Conference Paper
%T Training OOD Detectors in their Natural Habitats
%A Julian Katz-Samuels
%A Julia B Nakhleh
%A Robert Nowak
%A Yixuan Li
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-katz-samuels22a
%I PMLR
%P 10848--10865
%U https://proceedings.mlr.press/v162/katz-samuels22a.html
%V 162
%X Out-of-distribution (OOD) detection is important for machine learning models deployed in the wild. Recent methods use auxiliary outlier data to regularize the model for improved OOD detection. However, these approaches make a strong distributional assumption that the auxiliary outlier data is completely separable from the in-distribution (ID) data. In this paper, we propose a novel framework that leverages wild mixture data—that naturally consists of both ID and OOD samples. Such wild data is abundant and arises freely upon deploying a machine learning classifier in their natural habitats. Our key idea is to formulate a constrained optimization problem and to show how to tractably solve it. Our learning objective maximizes the OOD detection rate, subject to constraints on the classification error of ID data and on the OOD error rate of ID examples. We extensively evaluate our approach on common OOD detection tasks and demonstrate superior performance. Code is available at https://github.com/jkatzsam/woods_ood.
APA
Katz-Samuels, J., Nakhleh, J.B., Nowak, R. & Li, Y. (2022). Training OOD Detectors in their Natural Habitats. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:10848-10865. Available from https://proceedings.mlr.press/v162/katz-samuels22a.html.

Related Material