Local Kernel Density Ratio-Based Feature Selection for Outlier Detection

Fatemeh Azmandian, Jennifer G. Dy, Javed A. Aslam, David R. Kaeli
Proceedings of the Asian Conference on Machine Learning, PMLR 25:49-64, 2012.

Abstract

Selecting features is an important step of any machine learning task, though most of the focus has been on choosing features relevant for classification and regression. In this work, we present a novel non-parametric evaluation criterion for filter-based feature selection that enhances outlier detection. Our proposed method seeks the subset of features that represents the inherent characteristics of the normal dataset while forcing outliers to stand out, making them more easily distinguished by outlier detection algorithms. Experimental results on real datasets show the advantage of this feature selection algorithm compared to popular and state-of-the-art methods. We also show that the proposed algorithm is able to overcome the small sample space problem and perform well on highly imbalanced datasets.
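
To make the idea concrete, below is a minimal sketch of how a local kernel-density-ratio criterion of this general kind could be scored and used inside a greedy forward search. It is an illustration only, not the paper's exact criterion: the function names, the Gaussian kernel and fixed bandwidth, the particular ratio (mean density of normal points over mean density of outliers, both estimated from the normal reference set), and the assumption that a small set of labeled outliers is available when scoring subsets are all assumptions made here for the example; the paper's local density estimate, ratio definition, and search strategy may differ.

import numpy as np

def local_density(X_ref, X_query, bandwidth=1.0):
    """Average Gaussian-kernel density of each query point, estimated from the
    reference points (restricted to the chosen features).  Note: when the query
    set equals the reference set, each point's own kernel contribution is
    included; this sketch does not correct for that."""
    # Pairwise squared Euclidean distances, shape (n_query, n_ref).
    d2 = ((X_query[:, None, :] - X_ref[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel values; each query point's density is the mean over the reference set.
    k = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return k.mean(axis=1)

def density_ratio_score(X_normal, X_outlier, feature_subset, bandwidth=1.0):
    """Score a candidate feature subset: average local density of normal points
    divided by average local density of outliers, both measured against the
    normal reference set.  Higher is better: normal points stay dense while
    outliers land in low-density regions and stand out."""
    Xn = X_normal[:, feature_subset]
    Xo = X_outlier[:, feature_subset]
    dens_normal = local_density(Xn, Xn, bandwidth).mean()
    dens_outlier = local_density(Xn, Xo, bandwidth).mean()
    return dens_normal / (dens_outlier + 1e-12)

def greedy_forward_selection(X_normal, X_outlier, n_features, bandwidth=1.0):
    """Greedy forward search: repeatedly add the feature that most improves
    the density-ratio score of the current subset."""
    remaining = list(range(X_normal.shape[1]))
    selected = []
    while remaining and len(selected) < n_features:
        best_f, best_score = None, -np.inf
        for f in remaining:
            score = density_ratio_score(X_normal, X_outlier, selected + [f], bandwidth)
            if score > best_score:
                best_f, best_score = f, score
        selected.append(best_f)
        remaining.remove(best_f)
    return selected

Because this is a filter criterion, the selected subset is independent of the downstream model: it would then be handed to any off-the-shelf outlier detector (e.g., LOF or a one-class SVM) for the actual detection step.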

Cite this Paper


BibTeX
@InProceedings{pmlr-v25-azmandian12,
  title     = {Local Kernel Density Ratio-Based Feature Selection for Outlier Detection},
  author    = {Fatemeh Azmandian and Jennifer G. Dy and Javed A. Aslam and David R. Kaeli},
  booktitle = {Proceedings of the Asian Conference on Machine Learning},
  pages     = {49--64},
  year      = {2012},
  editor    = {Steven C. H. Hoi and Wray Buntine},
  volume    = {25},
  series    = {Proceedings of Machine Learning Research},
  address   = {Singapore Management University, Singapore},
  month     = {04--06 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v25/azmandian12/azmandian12.pdf},
  url       = {http://proceedings.mlr.press/v25/azmandian12.html},
  abstract  = {Selecting features is an important step of any machine learning task, though most of the focus has been to choose features relevant for classification and regression. In this work, we present a novel non-parametric evaluation criterion for filter-based feature selection which enhances outlier detection. Our proposed method seeks the subset of features that represents the inherent characteristics of the normal dataset while forcing outliers to stand out, making them more easily distinguished by outlier detection algorithms. Experimental results on real datasets show the advantage of this feature selection algorithm compared to popular and state-of-the-art methods. We also show that the proposed algorithm is able to overcome the small sample space problem and perform well on highly imbalanced datasets.}
}
APA
Azmandian, F., Dy, J.G., Aslam, J.A. & Kaeli, D.R. (2012). Local Kernel Density Ratio-Based Feature Selection for Outlier Detection. Proceedings of the Asian Conference on Machine Learning, in PMLR 25:49-64.
