Bayesian Decision Theory on Decision Trees: Uncertainty Evaluation and Interpretability

Yuta Nakahara, Shota Saito, Naoki Ichijo, Koki Kazama, Toshiyasu Matsushima
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1045-1053, 2025.

Abstract

Deterministic decision trees have difficulty evaluating uncertainty, especially for small samples. To solve this problem, we interpret the decision trees as stochastic models and consider prediction problems in the framework of Bayesian decision theory. Our models have three kinds of parameters: a tree shape, leaf parameters, and inner parameters. To make Bayesian optimal decisions, we have to calculate the posterior distribution of these parameters. Previously, two types of methods have been proposed. One marginalizes out the leaf parameters and samples the tree shape and the inner parameters by Metropolis-Hastings (MH) algorithms. The other marginalizes out both the leaf parameters and the tree shape based on a concept called meta-trees and approximates the posterior distribution for the inner parameters by a bagging-like method. In this paper, we propose a novel MH algorithm where the leaf parameters and the tree shape are marginalized out by using the meta-trees and only the inner parameters are sampled. Moreover, we update all the inner parameters simultaneously in each MH step. This algorithm accelerates the convergence and mixing of the Markov chain. We evaluate our algorithm on various benchmark datasets, comparing it with other state-of-the-art methods. Further, our model provides a novel statistical evaluation of feature importance.
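The sampling scheme described above can be illustrated with a minimal sketch. This is not the paper's implementation: the meta-tree marginal likelihood is replaced by a hypothetical `log_marginal` callback supplied by the caller, and the inner parameters are modeled simply as one discrete feature index per inner node. What the sketch does show is the structure of the proposed MH step: all inner parameters are re-proposed simultaneously, and the acceptance ratio uses only the marginal likelihood, since the leaf parameters and tree shape are assumed to have been integrated out inside `log_marginal`.

```python
import math
import random

def mh_inner_params(log_marginal, n_nodes, n_features, n_iters, seed=0):
    """Toy Metropolis-Hastings sampler over discrete inner parameters
    (one feature index per inner node).

    `log_marginal(state)` is a caller-supplied stand-in for the
    log marginal likelihood with leaf parameters and tree shape
    marginalized out (via meta-trees, in the paper's setting).
    """
    rng = random.Random(seed)
    # Initial state: a feature index for every inner node.
    state = [rng.randrange(n_features) for _ in range(n_nodes)]
    log_p = log_marginal(state)
    samples = []
    for _ in range(n_iters):
        # Joint update: re-propose ALL inner parameters at once
        # from a symmetric (uniform) proposal distribution.
        proposal = [rng.randrange(n_features) for _ in range(n_nodes)]
        log_q = log_marginal(proposal)
        # Accept with probability min(1, p(proposal) / p(state)).
        if math.log(rng.random()) < log_q - log_p:
            state, log_p = proposal, log_q
        samples.append(tuple(state))
    return samples
```

For example, with a toy target that strongly prefers feature 0 at every node, such as `lambda s: -5.0 * sum(1 for x in s if x != 0)`, the chain quickly concentrates on the all-zeros assignment. The joint update is the point of contrast with coordinate-wise schemes: because the whole inner-parameter vector moves in one step, the chain is not constrained to traverse intermediate configurations one node at a time.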

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-nakahara25a, title = {Bayesian Decision Theory on Decision Trees: Uncertainty Evaluation and Interpretability}, author = {Nakahara, Yuta and Saito, Shota and Ichijo, Naoki and Kazama, Koki and Matsushima, Toshiyasu}, booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics}, pages = {1045--1053}, year = {2025}, editor = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz}, volume = {258}, series = {Proceedings of Machine Learning Research}, month = {03--05 May}, publisher = {PMLR}, pdf = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/nakahara25a/nakahara25a.pdf}, url = {https://proceedings.mlr.press/v258/nakahara25a.html}, abstract = {Deterministic decision trees have difficulty in evaluating uncertainty especially for small samples. To solve this problem, we interpret the decision trees as stochastic models and consider prediction problems in the framework of Bayesian decision theory. Our models have three kinds of parameters: a tree shape, leaf parameters, and inner parameters. To make Bayesian optimal decisions, we have to calculate the posterior distribution of these parameters. Previously, two types of methods have been proposed. One marginalizes out the leaf parameters and samples the tree shape and the inner parameters by Metropolis-Hastings (MH) algorithms. The other marginalizes out both the leaf parameters and the tree shape based on a concept called meta-trees and approximates the posterior distribution for the inner parameters by a bagging-like method. In this paper, we propose a novel MH algorithm where the leaf parameters and the tree shape are marginalized out by using the meta-trees and only the inner parameters are sampled. Moreover, we update all the inner parameters simultaneously in each MH step. This algorithm accelerates the convergence and mixing of the Markov chain. We evaluate our algorithm on various benchmark datasets with other state-of-the-art methods. Further, our model provides a novel statistical evaluation of feature importance.} }
Endnote
%0 Conference Paper %T Bayesian Decision Theory on Decision Trees: Uncertainty Evaluation and Interpretability %A Yuta Nakahara %A Shota Saito %A Naoki Ichijo %A Koki Kazama %A Toshiyasu Matsushima %B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2025 %E Yingzhen Li %E Stephan Mandt %E Shipra Agrawal %E Emtiyaz Khan %F pmlr-v258-nakahara25a %I PMLR %P 1045--1053 %U https://proceedings.mlr.press/v258/nakahara25a.html %V 258 %X Deterministic decision trees have difficulty in evaluating uncertainty especially for small samples. To solve this problem, we interpret the decision trees as stochastic models and consider prediction problems in the framework of Bayesian decision theory. Our models have three kinds of parameters: a tree shape, leaf parameters, and inner parameters. To make Bayesian optimal decisions, we have to calculate the posterior distribution of these parameters. Previously, two types of methods have been proposed. One marginalizes out the leaf parameters and samples the tree shape and the inner parameters by Metropolis-Hastings (MH) algorithms. The other marginalizes out both the leaf parameters and the tree shape based on a concept called meta-trees and approximates the posterior distribution for the inner parameters by a bagging-like method. In this paper, we propose a novel MH algorithm where the leaf parameters and the tree shape are marginalized out by using the meta-trees and only the inner parameters are sampled. Moreover, we update all the inner parameters simultaneously in each MH step. This algorithm accelerates the convergence and mixing of the Markov chain. We evaluate our algorithm on various benchmark datasets with other state-of-the-art methods. Further, our model provides a novel statistical evaluation of feature importance.
APA
Nakahara, Y., Saito, S., Ichijo, N., Kazama, K. & Matsushima, T. (2025). Bayesian Decision Theory on Decision Trees: Uncertainty Evaluation and Interpretability. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1045-1053. Available from https://proceedings.mlr.press/v258/nakahara25a.html.