Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring

Alexander F. Lapanowski, Irina Gaynanova
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1704-1713, 2019.

Abstract

We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework. Unlike previous approaches, we provide theoretical guarantees on the expected risk consistency of the method. We also allow for feature selection by imposing structured sparsity using weighted kernels. We propose fully-automated methods for selection of all tuning parameters, and in particular adapt kernel shrinkage ideas for ridge parameter selection. Numerical studies demonstrate the superior classification performance of the proposed approach compared to existing nonparametric classifiers.
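The optimal scoring framework ties discriminant analysis to regression: for two groups, fitting a kernel discriminant via optimal scoring amounts to kernel ridge regression on zero-mean class scores. The sketch below illustrates that general reduction only; it is not the authors' algorithm (which additionally imposes structured sparsity through weighted kernels and automates tuning-parameter selection), and all function names, the RBF kernel choice, and the parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kos(X, y, lam=1e-2, gamma=1.0):
    # y in {0, 1}. Encode the two classes with zero-mean optimal scores,
    # then solve a kernel ridge system for the discriminant coefficients.
    n = len(y)
    n0 = int(np.sum(y == 0))
    n1 = n - n0
    theta = np.where(y == 0, np.sqrt(n1 / (n * n0)), -np.sqrt(n0 / (n * n1)))
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), theta)
    # Midpoint of the class-mean discriminant scores is the decision threshold.
    s = K @ alpha
    m0, m1 = s[y == 0].mean(), s[y == 1].mean()
    return alpha, (m0 + m1) / 2, m0 > m1

def predict_kos(model, X_train, X_new, gamma=1.0):
    alpha, thr, class0_high = model
    s = rbf_kernel(X_new, X_train, gamma) @ alpha
    # Assign class 0 when the score falls on class 0's side of the threshold.
    return np.where((s > thr) == class0_high, 0, 1)

# Toy usage: two well-separated Gaussian clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = fit_kos(X, y)
acc = (predict_kos(model, X, X) == y).mean()
```

The zero-mean score encoding is what distinguishes optimal scoring from naive label regression: it makes the ridge solution equivalent (up to scaling) to the discriminant direction, which is why consistency arguments for the classifier can be routed through regression theory.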

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-lapanowski19a,
  title     = {Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring},
  author    = {Lapanowski, Alexander F. and Gaynanova, Irina},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1704--1713},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/lapanowski19a/lapanowski19a.pdf},
  url       = {https://proceedings.mlr.press/v89/lapanowski19a.html},
  abstract  = {We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework. Unlike previous approaches, we provide theoretical guarantees on the expected risk consistency of the method. We also allow for feature selection by imposing structured sparsity using weighted kernels. We propose fully-automated methods for selection of all tuning parameters, and in particular adapt kernel shrinkage ideas for ridge parameter selection. Numerical studies demonstrate the superior classification performance of the proposed approach compared to existing nonparametric classifiers.}
}
Endnote
%0 Conference Paper
%T Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring
%A Alexander F. Lapanowski
%A Irina Gaynanova
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-lapanowski19a
%I PMLR
%P 1704--1713
%U https://proceedings.mlr.press/v89/lapanowski19a.html
%V 89
%X We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework. Unlike previous approaches, we provide theoretical guarantees on the expected risk consistency of the method. We also allow for feature selection by imposing structured sparsity using weighted kernels. We propose fully-automated methods for selection of all tuning parameters, and in particular adapt kernel shrinkage ideas for ridge parameter selection. Numerical studies demonstrate the superior classification performance of the proposed approach compared to existing nonparametric classifiers.
APA
Lapanowski, A. F. & Gaynanova, I. (2019). Sparse Feature Selection in Kernel Discriminant Analysis via Optimal Scoring. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1704-1713. Available from https://proceedings.mlr.press/v89/lapanowski19a.html.