Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy

Jean Honorio, Tommi Jaakkola
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):459-467, 2013.

Abstract

We provide a method that approximates the Bayes error rate and the Shannon entropy with high probability. The Bayes error rate approximation makes it possible to build a classifier that polynomially approaches the Bayes error rate. The Shannon entropy approximation provides provable performance guarantees for learning trees and Bayesian networks from continuous variables. Our results rely on reasonable regularity conditions on the unknown probability distributions, and apply to bounded as well as unbounded variables.
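For reference, the two quantities being approximated have standard definitions (these are the textbook formulas, not the paper's estimators). For a label Y with class-posterior P(Y = y | X = x) and a continuous density p, the Bayes error rate and the differential Shannon entropy are

R^* \;=\; \mathbb{E}_X\!\left[\, 1 - \max_y P(Y = y \mid X) \,\right],
\qquad
H(p) \;=\; -\int p(x)\,\log p(x)\,dx .

The paper's results give two-sided exponential concentration bounds for finite-sample approximations of these quantities.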

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-honorio13,
  title     = {Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy},
  author    = {Honorio, Jean and Jaakkola, Tommi},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {459--467},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/honorio13.pdf},
  url       = {https://proceedings.mlr.press/v28/honorio13.html}
}
APA
Honorio, J. & Jaakkola, T. (2013). Two-Sided Exponential Concentration Bounds for Bayes Error Rate and Shannon Entropy. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):459-467. Available from https://proceedings.mlr.press/v28/honorio13.html.
