Non-asymptotic Analysis of Compressive Fisher Discriminants in terms of the Effective Dimension

Ata Kaban
Asian Conference on Machine Learning, PMLR 45:17-32, 2016.

Abstract

We provide a non-asymptotic analysis of the generalisation error of compressive Fisher linear discriminant (FLD) classification that is dimension free under mild assumptions. Our analysis includes the effects that random projection has on classification performance under covariance model misspecification, as well as various good and bad effects of random projections that contribute to the overall performance of compressive FLD. We also give an asymptotic bound as a corollary of our finite sample result. An important ingredient of our analysis is to develop new dimension-free bounds on the largest and smallest eigenvalue of the compressive covariance, which may be of independent interest.

Cite this Paper


BibTeX
@InProceedings{pmlr-v45-Kaban15a,
  title     = {Non-asymptotic Analysis of Compressive Fisher Discriminants in terms of the Effective Dimension},
  author    = {Kaban, Ata},
  booktitle = {Asian Conference on Machine Learning},
  pages     = {17--32},
  year      = {2016},
  editor    = {Holmes, Geoffrey and Liu, Tie-Yan},
  volume    = {45},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hong Kong},
  month     = {20--22 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v45/Kaban15a.pdf},
  url       = {https://proceedings.mlr.press/v45/Kaban15a.html}
}
