Nonlinear Online Classification Algorithm with Probability Margin

Mingmin Chi, Huijun He, Wenqiang Zhang
Proceedings of the Asian Conference on Machine Learning, PMLR 20:33-46, 2011.

Abstract

Nonlinear online learning algorithms typically have to store a set of misclassified examples in order to compute kernel values. For large-scale problems, this is not only time consuming but also leads to out-of-memory problems. In this paper, a nonlinear online classification algorithm with a probability margin is proposed to address this problem. In particular, the discriminant function is defined by a Gaussian mixture model built from the statistics of all observed examples rather than from individual data points. The learned model is then used to train a nonlinear online classification algorithm with confidence, where the corresponding margin is defined in terms of probability. In doing so, the internal memory requirement is significantly reduced while classification performance is preserved. We also prove mistake bounds in terms of the generative model. Experiments carried out on one synthetic and two real large-scale data sets validate the effectiveness of the proposed approach.
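
The abstract outlines the mechanism at a high level: instead of storing misclassified examples for kernel evaluation, each class is summarized by generative statistics, and the model is updated only when the probability margin on a new example is small. The Python sketch below is a minimal illustration of that idea under assumptions of our own, not the authors' implementation: each class is modelled by a single running Gaussian with diagonal covariance (a one-component mixture), the margin is taken as the difference of class posteriors, and the class name GMMProbMarginClassifier, the confidence threshold, and the update rule are all invented for illustration.

import numpy as np

class GMMProbMarginClassifier:
    """Illustrative online binary classifier with a probability margin.

    Each class is summarized by running sufficient statistics (count, mean,
    mean of squares), so no misclassified examples are stored for kernel
    evaluation. The signed margin is the difference of class posteriors;
    an update is triggered when that margin falls below a confidence
    threshold. All modelling choices here are illustrative assumptions,
    not the authors' exact method.
    """

    def __init__(self, dim, margin=0.1, var_floor=1e-3):
        self.margin = margin
        self.var_floor = var_floor
        # Per-class running statistics: count, mean, mean of squares.
        self.n = {+1: 0, -1: 0}
        self.mu = {+1: np.zeros(dim), -1: np.zeros(dim)}
        self.sq = {+1: np.full(dim, var_floor), -1: np.full(dim, var_floor)}

    def _log_gauss(self, x, y):
        # Diagonal-covariance Gaussian log-density for class y.
        var = np.maximum(self.sq[y] - self.mu[y] ** 2, self.var_floor)
        return -0.5 * np.sum((x - self.mu[y]) ** 2 / var + np.log(2 * np.pi * var))

    def decision(self, x):
        # Signed probability margin: posterior(+1) - posterior(-1).
        if self.n[+1] == 0 or self.n[-1] == 0:
            return 0.0
        log_p = np.array([self._log_gauss(x, +1), self._log_gauss(x, -1)])
        log_p = log_p + np.log([self.n[+1], self.n[-1]])  # empirical class priors
        p = np.exp(log_p - log_p.max())
        p /= p.sum()
        return p[0] - p[1]

    def update(self, x, y):
        # Update class-y statistics only when the probability margin is small.
        if y * self.decision(x) <= self.margin:
            self.n[y] += 1
            eta = 1.0 / self.n[y]
            self.mu[y] = self.mu[y] + eta * (x - self.mu[y])
            self.sq[y] = self.sq[y] + eta * (x ** 2 - self.sq[y])

A toy usage example on two well-separated Gaussian classes:

rng = np.random.default_rng(0)
clf = GMMProbMarginClassifier(dim=2)
for _ in range(1000):
    y = rng.choice([-1, +1])
    x = rng.normal(loc=2.0 * y, scale=1.0, size=2)
    clf.update(x, y)
print(clf.decision(np.array([2.0, 2.0])) > 0)  # expected: True

The point of the sketch is that memory stays constant in the number of observed examples: only per-class counts, means, and second moments are kept, rather than a growing set of support examples.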

Cite this Paper


BibTeX
@InProceedings{pmlr-v20-chi11,
  title     = {Nonlinear Online Classification Algorithm with Probability Margin},
  author    = {Chi, Mingmin and He, Huijun and Zhang, Wenqiang},
  booktitle = {Proceedings of the Asian Conference on Machine Learning},
  pages     = {33--46},
  year      = {2011},
  editor    = {Hsu, Chun-Nan and Lee, Wee Sun},
  volume    = {20},
  series    = {Proceedings of Machine Learning Research},
  address   = {South Garden Hotels and Resorts, Taoyuan, Taiwan},
  month     = {14--15 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v20/chi11/chi11.pdf},
  url       = {https://proceedings.mlr.press/v20/chi11.html},
  abstract  = {Nonlinear online learning algorithms typically have to store a set of misclassified examples in order to compute kernel values. For large-scale problems, this is not only time consuming but also leads to out-of-memory problems. In this paper, a nonlinear online classification algorithm with a probability margin is proposed to address this problem. In particular, the discriminant function is defined by a Gaussian mixture model built from the statistics of all observed examples rather than from individual data points. The learned model is then used to train a nonlinear online classification algorithm with confidence, where the corresponding margin is defined in terms of probability. In doing so, the internal memory requirement is significantly reduced while classification performance is preserved. We also prove mistake bounds in terms of the generative model. Experiments carried out on one synthetic and two real large-scale data sets validate the effectiveness of the proposed approach.}
}
Endnote
%0 Conference Paper
%T Nonlinear Online Classification Algorithm with Probability Margin
%A Mingmin Chi
%A Huijun He
%A Wenqiang Zhang
%B Proceedings of the Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2011
%E Chun-Nan Hsu
%E Wee Sun Lee
%F pmlr-v20-chi11
%I PMLR
%P 33--46
%U https://proceedings.mlr.press/v20/chi11.html
%V 20
%X Nonlinear online learning algorithms typically have to store a set of misclassified examples in order to compute kernel values. For large-scale problems, this is not only time consuming but also leads to out-of-memory problems. In this paper, a nonlinear online classification algorithm with a probability margin is proposed to address this problem. In particular, the discriminant function is defined by a Gaussian mixture model built from the statistics of all observed examples rather than from individual data points. The learned model is then used to train a nonlinear online classification algorithm with confidence, where the corresponding margin is defined in terms of probability. In doing so, the internal memory requirement is significantly reduced while classification performance is preserved. We also prove mistake bounds in terms of the generative model. Experiments carried out on one synthetic and two real large-scale data sets validate the effectiveness of the proposed approach.
RIS
TY - CPAPER
TI - Nonlinear Online Classification Algorithm with Probability Margin
AU - Mingmin Chi
AU - Huijun He
AU - Wenqiang Zhang
BT - Proceedings of the Asian Conference on Machine Learning
DA - 2011/11/17
ED - Chun-Nan Hsu
ED - Wee Sun Lee
ID - pmlr-v20-chi11
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 20
SP - 33
EP - 46
L1 - http://proceedings.mlr.press/v20/chi11/chi11.pdf
UR - https://proceedings.mlr.press/v20/chi11.html
AB - Nonlinear online learning algorithms typically have to store a set of misclassified examples in order to compute kernel values. For large-scale problems, this is not only time consuming but also leads to out-of-memory problems. In this paper, a nonlinear online classification algorithm with a probability margin is proposed to address this problem. In particular, the discriminant function is defined by a Gaussian mixture model built from the statistics of all observed examples rather than from individual data points. The learned model is then used to train a nonlinear online classification algorithm with confidence, where the corresponding margin is defined in terms of probability. In doing so, the internal memory requirement is significantly reduced while classification performance is preserved. We also prove mistake bounds in terms of the generative model. Experiments carried out on one synthetic and two real large-scale data sets validate the effectiveness of the proposed approach.
ER -
APA
Chi, M., He, H. & Zhang, W. (2011). Nonlinear Online Classification Algorithm with Probability Margin. Proceedings of the Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 20:33-46. Available from https://proceedings.mlr.press/v20/chi11.html.