Cautious Random Forests: a New Decision Strategy and some Experiments

Haifei Zhang, Benjamin Quost, Marie-Hélène Masson
Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications, PMLR 147:369-372, 2021.

Abstract

Random forest is an accurate classification strategy that estimates the posterior probabilities of the classes by averaging the frequencies provided by the trees. When data are scarce, this estimation becomes difficult. The Imprecise Dirichlet Model can be used to make the estimation robust, providing probability intervals as outputs. Here, we propose a new aggregation strategy based on the theory of belief functions. We also propose to assign weights to the trees according to their amount of uncertainty when classifying a new instance. Our approach is compared experimentally to the baseline approach on several datasets.
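Since the abstract refers to the Imprecise Dirichlet Model (IDM) without detail, the sketch below illustrates the standard IDM bounds that a tree leaf can produce, n_k/(n+s) and (n_k+s)/(n+s), together with one simple way of combining per-tree intervals by averaging their bounds. The function name idm_interval, the leaf counts, and the averaging step are illustrative assumptions only; they are not the authors' belief-function aggregation or tree-weighting scheme, and the abstract does not specify how the paper's baseline combines the intervals.

```python
import numpy as np

def idm_interval(counts, s=2.0):
    """Imprecise Dirichlet Model bounds for class probabilities.

    Given the class counts observed in the leaf reached by an instance,
    the lower bound for class k is n_k / (n + s) and the upper bound is
    (n_k + s) / (n + s), where n is the total count and s the IDM
    hyperparameter.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    return counts / (n + s), (counts + s) / (n + s)

# Hypothetical leaf counts (two classes) reached by one test instance
# in three trees of a forest.
leaf_counts = [[8, 2], [1, 1], [5, 5]]
bounds = [idm_interval(c, s=2.0) for c in leaf_counts]

# One simple way to combine the per-tree intervals: average the bounds.
lower = np.mean([lo for lo, _ in bounds], axis=0)
upper = np.mean([up for _, up in bounds], axis=0)
print("lower bounds:", lower)  # ~[0.44, 0.28]
print("upper bounds:", upper)  # ~[0.72, 0.56]
```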

Cite this Paper


BibTeX
@InProceedings{pmlr-v147-zhang21a,
  title     = {Cautious Random Forests: a New Decision Strategy and some Experiments},
  author    = {Zhang, Haifei and Quost, Benjamin and Masson, Marie-H\'el\`ene},
  booktitle = {Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications},
  pages     = {369--372},
  year      = {2021},
  editor    = {Cano, Andrés and De Bock, Jasper and Miranda, Enrique and Moral, Serafín},
  volume    = {147},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v147/zhang21a/zhang21a.pdf},
  url       = {https://proceedings.mlr.press/v147/zhang21a.html},
  abstract  = {Random forest is an accurate classification strategy, which estimates the posterior probabilities of the classes by averaging frequencies provided by trees. When data are scarce, this estimation becomes difficult. The Imprecise Dirichlet Model can be used to make the estimation robust, providing intervals of probabilities as outputs. Here, we propose a new aggregation strategy based on the theory of belief functions. We also propose to assign weights to the trees according to their amount of uncertainty when classifying a new instance. Our approach is compared experimentally to the baseline approach on several datasets.}
}
Endnote
%0 Conference Paper
%T Cautious Random Forests: a New Decision Strategy and some Experiments
%A Haifei Zhang
%A Benjamin Quost
%A Marie-Hélène Masson
%B Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications
%C Proceedings of Machine Learning Research
%D 2021
%E Andrés Cano
%E Jasper De Bock
%E Enrique Miranda
%E Serafín Moral
%F pmlr-v147-zhang21a
%I PMLR
%P 369--372
%U https://proceedings.mlr.press/v147/zhang21a.html
%V 147
%X Random forest is an accurate classification strategy, which estimates the posterior probabilities of the classes by averaging frequencies provided by trees. When data are scarce, this estimation becomes difficult. The Imprecise Dirichlet Model can be used to make the estimation robust, providing intervals of probabilities as outputs. Here, we propose a new aggregation strategy based on the theory of belief functions. We also propose to assign weights to the trees according to their amount of uncertainty when classifying a new instance. Our approach is compared experimentally to the baseline approach on several datasets.
APA
Zhang, H., Quost, B. & Masson, M.-H. (2021). Cautious Random Forests: a New Decision Strategy and some Experiments. Proceedings of the Twelfth International Symposium on Imprecise Probability: Theories and Applications, in Proceedings of Machine Learning Research 147:369-372. Available from https://proceedings.mlr.press/v147/zhang21a.html.
