Neural Transformation Learning for Deep Anomaly Detection Beyond Images

Chen Qiu, Timo Pfrommer, Marius Kloft, Stephan Mandt, Maja Rudolph
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8703-8714, 2021.

Abstract

Data transformations (e.g. rotations, reflections, and cropping) play an important role in self-supervised learning. Typically, images are transformed into different views, and neural networks trained on tasks involving these views produce useful feature representations for downstream tasks, including anomaly detection. However, for anomaly detection beyond image data, it is often unclear which transformations to use. Here we present a simple end-to-end procedure for anomaly detection with learnable transformations. The key idea is to embed the transformed data into a semantic space such that the transformed data still resemble their untransformed form, while different transformations are easily distinguishable. Extensive experiments on time series show that our proposed method outperforms existing approaches in the one-vs.-rest setting and is competitive in the more challenging n-vs.-rest anomaly-detection task. On medical and cyber-security tabular data, our method learns domain-specific transformations and detects anomalies more accurately than previous work.
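The key idea from the abstract can be illustrated with a minimal numpy sketch. This is a hypothetical stand-in, not the paper's implementation: the transformations T_k and encoder here are fixed random linear maps, whereas the paper learns both as neural networks end-to-end; the contrastive score below follows the stated intuition (each transformed view should stay close to the untransformed embedding while remaining distinguishable from the other views).

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

D, H, K = 8, 4, 3  # input dim, embedding dim, number of transformations
# Hypothetical stand-ins: random linear maps for the learnable
# transformations T_k and a fixed linear encoder (the paper trains
# neural networks for both).
T = [rng.normal(size=(D, D)) for _ in range(K)]
W = rng.normal(size=(H, D))  # encoder

def anomaly_score(x, temperature=0.1):
    """Contrastive loss of one sample, reused as its anomaly score:
    each view T_k(x) is pulled toward the embedding of x (positive)
    and pushed away from the other views (negatives)."""
    z = W @ x                       # embedding of the untransformed sample
    zs = [W @ (Tk @ x) for Tk in T]  # embeddings of the K transformed views
    score = 0.0
    for k in range(K):
        pos = np.exp(cosine(zs[k], z) / temperature)
        neg = sum(np.exp(cosine(zs[k], zs[l]) / temperature)
                  for l in range(K) if l != k)
        score += -np.log(pos / (pos + neg))
    return score

x = rng.normal(size=D)
s = anomaly_score(x)
print(s)  # higher score suggests the sample is more anomalous
```

At test time a sample is scored by this same objective: data resembling the training distribution yields a low loss, while anomalies yield a high one.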

Cite this Paper
BibTeX
@InProceedings{pmlr-v139-qiu21a,
  title =     {Neural Transformation Learning for Deep Anomaly Detection Beyond Images},
  author =    {Qiu, Chen and Pfrommer, Timo and Kloft, Marius and Mandt, Stephan and Rudolph, Maja},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages =     {8703--8714},
  year =      {2021},
  editor =    {Meila, Marina and Zhang, Tong},
  volume =    {139},
  series =    {Proceedings of Machine Learning Research},
  month =     {18--24 Jul},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v139/qiu21a/qiu21a.pdf},
  url =       {https://proceedings.mlr.press/v139/qiu21a.html},
  abstract =  {Data transformations (e.g. rotations, reflections, and cropping) play an important role in self-supervised learning. Typically, images are transformed into different views, and neural networks trained on tasks involving these views produce useful feature representations for downstream tasks, including anomaly detection. However, for anomaly detection beyond image data, it is often unclear which transformations to use. Here we present a simple end-to-end procedure for anomaly detection with learnable transformations. The key idea is to embed the transformed data into a semantic space such that the transformed data still resemble their untransformed form, while different transformations are easily distinguishable. Extensive experiments on time series show that our proposed method outperforms existing approaches in the one-vs.-rest setting and is competitive in the more challenging n-vs.-rest anomaly-detection task. On medical and cyber-security tabular data, our method learns domain-specific transformations and detects anomalies more accurately than previous work.}
}
Endnote
%0 Conference Paper
%T Neural Transformation Learning for Deep Anomaly Detection Beyond Images
%A Chen Qiu
%A Timo Pfrommer
%A Marius Kloft
%A Stephan Mandt
%A Maja Rudolph
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-qiu21a
%I PMLR
%P 8703--8714
%U https://proceedings.mlr.press/v139/qiu21a.html
%V 139
%X Data transformations (e.g. rotations, reflections, and cropping) play an important role in self-supervised learning. Typically, images are transformed into different views, and neural networks trained on tasks involving these views produce useful feature representations for downstream tasks, including anomaly detection. However, for anomaly detection beyond image data, it is often unclear which transformations to use. Here we present a simple end-to-end procedure for anomaly detection with learnable transformations. The key idea is to embed the transformed data into a semantic space such that the transformed data still resemble their untransformed form, while different transformations are easily distinguishable. Extensive experiments on time series show that our proposed method outperforms existing approaches in the one-vs.-rest setting and is competitive in the more challenging n-vs.-rest anomaly-detection task. On medical and cyber-security tabular data, our method learns domain-specific transformations and detects anomalies more accurately than previous work.
APA
Qiu, C., Pfrommer, T., Kloft, M., Mandt, S. &amp; Rudolph, M. (2021). Neural Transformation Learning for Deep Anomaly Detection Beyond Images. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8703-8714. Available from https://proceedings.mlr.press/v139/qiu21a.html.