Unsupervised Novelty Detection in Pretrained Representation Space with Locally Adapted Likelihood Ratio

Amirhossein Ahmadian, Yifan Ding, Gabriel Eilertsen, Fredrik Lindsten
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:874-882, 2024.

Abstract

Detecting novelties given unlabeled examples of normal data is a challenging task in machine learning, particularly when the novel and normal categories are semantically close. Large deep models pretrained on massive datasets can provide a rich representation space in which the simple k-nearest neighbor distance works as a novelty measure. However, as we show in this paper, the basic k-NN method can be insufficient in this context because it ignores the 'local geometry' of the distribution over representations as well as the impact of irrelevant 'background features'. To address this, we propose a fully unsupervised novelty detection approach that integrates the flexibility of k-NN with a locally adapted scaling of dimensions, based on the 'neighbors of the nearest neighbor', and computes a 'likelihood ratio' in pretrained (self-supervised) representation spaces. Our experiments with image data show the advantage of this method when off-the-shelf vision transformers (e.g., pretrained by DINO) are used as the feature extractor without any fine-tuning.
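To make the idea concrete, below is a minimal Python sketch of a score in this spirit: it fits a diagonal Gaussian to the neighbors of each test point's nearest normal neighbor (the 'neighbors of the nearest neighbor') and contrasts that local fit against a crude global 'background' Gaussian, yielding a likelihood-ratio-style novelty score. This is an illustrative assumption-laden sketch, not the paper's exact estimator: the function name, the diagonal-Gaussian local model, and the global background term are our choices, and the inputs are assumed to be frozen pretrained embeddings (e.g., DINO ViT features).

import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_lr_novelty(train_feats, test_feats, k=10, eps=1e-6):
    """Hypothetical k-NN novelty score with locally adapted per-dimension
    scaling and a likelihood-ratio-style contrast. Illustrative only."""
    train_feats = np.asarray(train_feats, dtype=np.float64)
    test_feats = np.asarray(test_feats, dtype=np.float64)

    nn = NearestNeighbors(n_neighbors=k).fit(train_feats)

    # Step 1: nearest normal neighbor of each test point.
    _, idx1 = nn.kneighbors(test_feats, n_neighbors=1)
    anchors = train_feats[idx1[:, 0]]          # (n_test, d)

    # Step 2: the k neighbors of that nearest neighbor describe the
    # local geometry of the normal data around the test point.
    _, idx_k = nn.kneighbors(anchors, n_neighbors=k)

    # Crude global 'background' model: one diagonal Gaussian over all
    # normal features, intended to down-weight directions that vary a
    # lot everywhere (irrelevant 'background features').
    g_mu = train_feats.mean(axis=0)
    g_sd = train_feats.std(axis=0) + eps

    scores = np.empty(len(test_feats))
    for i, x in enumerate(test_feats):
        local = train_feats[idx_k[i]]          # (k, d) local neighborhood
        l_mu = local.mean(axis=0)
        l_sd = local.std(axis=0) + eps         # locally adapted scale

        # Diagonal-Gaussian negative log-likelihoods (constants dropped).
        nll_local = 0.5 * np.sum(((x - l_mu) / l_sd) ** 2) + np.sum(np.log(l_sd))
        nll_global = 0.5 * np.sum(((x - g_mu) / g_sd) ** 2) + np.sum(np.log(g_sd))

        # Likelihood-ratio-style score: large when x is poorly explained
        # by the local normal geometry relative to the background.
        scores[i] = nll_local - nll_global
    return scores  # higher = more novel

In use, train_feats would hold embeddings of the unlabeled normal set extracted by a frozen backbone (no fine-tuning), and test points would be flagged as novel when their score exceeds a threshold chosen on held-out normal data.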

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-ahmadian24a,
  title     = {Unsupervised Novelty Detection in Pretrained Representation Space with Locally Adapted Likelihood Ratio},
  author    = {Ahmadian, Amirhossein and Ding, Yifan and Eilertsen, Gabriel and Lindsten, Fredrik},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {874--882},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/ahmadian24a/ahmadian24a.pdf},
  url       = {https://proceedings.mlr.press/v238/ahmadian24a.html},
  abstract  = {Detecting novelties given unlabeled examples of normal data is a challenging task in machine learning, particularly when the novel and normal categories are semantically close. Large deep models pretrained on massive datasets can provide a rich representation space in which the simple k-nearest neighbor distance works as a novelty measure. However, as we show in this paper, the basic k-NN method can be insufficient in this context because it ignores the 'local geometry' of the distribution over representations as well as the impact of irrelevant 'background features'. To address this, we propose a fully unsupervised novelty detection approach that integrates the flexibility of k-NN with a locally adapted scaling of dimensions, based on the 'neighbors of the nearest neighbor', and computes a 'likelihood ratio' in pretrained (self-supervised) representation spaces. Our experiments with image data show the advantage of this method when off-the-shelf vision transformers (e.g., pretrained by DINO) are used as the feature extractor without any fine-tuning.}
}
Endnote
%0 Conference Paper
%T Unsupervised Novelty Detection in Pretrained Representation Space with Locally Adapted Likelihood Ratio
%A Amirhossein Ahmadian
%A Yifan Ding
%A Gabriel Eilertsen
%A Fredrik Lindsten
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-ahmadian24a
%I PMLR
%P 874--882
%U https://proceedings.mlr.press/v238/ahmadian24a.html
%V 238
%X Detecting novelties given unlabeled examples of normal data is a challenging task in machine learning, particularly when the novel and normal categories are semantically close. Large deep models pretrained on massive datasets can provide a rich representation space in which the simple k-nearest neighbor distance works as a novelty measure. However, as we show in this paper, the basic k-NN method can be insufficient in this context because it ignores the 'local geometry' of the distribution over representations as well as the impact of irrelevant 'background features'. To address this, we propose a fully unsupervised novelty detection approach that integrates the flexibility of k-NN with a locally adapted scaling of dimensions, based on the 'neighbors of the nearest neighbor', and computes a 'likelihood ratio' in pretrained (self-supervised) representation spaces. Our experiments with image data show the advantage of this method when off-the-shelf vision transformers (e.g., pretrained by DINO) are used as the feature extractor without any fine-tuning.
APA
Ahmadian, A., Ding, Y., Eilertsen, G. & Lindsten, F. (2024). Unsupervised Novelty Detection in Pretrained Representation Space with Locally Adapted Likelihood Ratio. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:874-882. Available from https://proceedings.mlr.press/v238/ahmadian24a.html.