Using Submodular Optimization to Approximate Minimum-Size Abductive Path Explanations for Tree-Based Models

Louenas Bounia
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:388-397, 2025.

Abstract

One of the key challenges of Explainable Artificial Intelligence (XAI) is providing concise and understandable explanations for classification model predictions. An abductive explanation for a given instance is a minimal set of features that justify the prediction. These minimal explanations are valuable for their interpretability, as they eliminate redundant or irrelevant information. However, computing these explanations is difficult, even for simpler classifiers like decision trees. Finding a minimum-size abductive explanation in decision trees is an NP-complete problem, and this complexity extends to random forests for minimum-size majoritary reasons. In this work, we focus on finding minimal sets of features along the paths leading to the decision, called path-abductive explanations. We show that the problem of finding minimum-size path-abductive explanations in decision trees and minimum-size path-majoritary reasons in random forests is also NP-complete. To address this, we reformulate the problem as a submodular optimization task and propose a greedy algorithm with optimality guarantees. Our experiments demonstrate that this algorithm produces near-optimal explanations efficiently and offers a strong alternative for difficult instances, where exact methods based on SAT encodings are computationally expensive. This approach is especially useful in resource-limited environments where modern SAT solvers are not feasible.
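The abstract describes reformulating minimum-size explanation search as a submodular optimization task solved greedily. As a rough illustration of that pattern (not the paper's actual formulation), the sketch below implements the classical greedy algorithm for submodular cover; the coverage function shown is plain set cover, and all names (`greedy_submodular_cover`, `paths_ruled_out`) are placeholder assumptions for this example.

```python
# Illustrative sketch only: generic greedy submodular cover, the
# optimization pattern the paper builds on. The toy coverage function
# (plain set cover) and all identifiers are assumptions, not the
# authors' formulation.

def greedy_submodular_cover(universe, candidates, cover):
    """Pick a small subset S of `candidates` with cover(S) == universe.

    `cover` maps a set of candidates to the subset of `universe` they
    cover; it must be monotone and submodular for the classical
    logarithmic approximation guarantee to hold.
    """
    selected = set()
    covered = cover(selected)
    while covered != universe:
        # Greedy step: add the candidate with the largest marginal gain.
        best = max(
            (c for c in candidates if c not in selected),
            key=lambda c: len(cover(selected | {c}) - covered),
        )
        if len(cover(selected | {best}) - covered) == 0:
            # No candidate makes progress: the universe is not coverable.
            raise ValueError("candidates cannot cover the universe")
        selected.add(best)
        covered = cover(selected)
    return selected

# Toy usage: think of each feature as "covering" the counterexample
# paths it rules out; a full cover would then justify the prediction.
paths_ruled_out = {
    "x1": {1, 2},
    "x2": {2, 3},
    "x3": {1, 2, 3, 4},
}
universe = {1, 2, 3, 4}
cover = lambda S: set().union(*(paths_ruled_out[c] for c in S)) if S else set()
explanation = greedy_submodular_cover(universe, set(paths_ruled_out), cover)
# Here the single feature "x3" already covers every path.
```

The greedy choice of maximum marginal gain is what yields the optimality guarantee the abstract refers to: for monotone submodular cover, greedy selection is within a logarithmic factor of the minimum-size solution.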

Cite this Paper
BibTeX
@InProceedings{pmlr-v286-bounia25a,
  title     = {Using Submodular Optimization to Approximate Minimum-Size Abductive Path Explanations for Tree-Based Models},
  author    = {Bounia, Louenas},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages     = {388--397},
  year      = {2025},
  editor    = {Chiappa, Silvia and Magliacane, Sara},
  volume    = {286},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--25 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/bounia25a/bounia25a.pdf},
  url       = {https://proceedings.mlr.press/v286/bounia25a.html},
  abstract  = {One of the key challenges of Explainable Artificial Intelligence (XAI) is providing concise and understandable explanations for classification model predictions. An abductive explanation for a given instance is a minimal set of features that justify the prediction. These minimal explanations are valuable for their interpretability, as they eliminate redundant or irrelevant information. However, computing these explanations is difficult, even for simpler classifiers like decision trees. Finding a minimum-size abductive explanation in decision trees is an NP-complete problem, and this complexity extends to random forests for minimum-size majoritary reasons. In this work, we focus on finding minimal sets of features along the paths leading to the decision, called path-abductive explanations. We show that the problem of finding minimum-size path-abductive explanations in decision trees and minimum-size path-majoritary reasons in random forests is also NP-complete. To address this, we reformulate the problem as a submodular optimization task and propose a greedy algorithm with optimality guarantees. Our experiments demonstrate that this algorithm produces near-optimal explanations efficiently and offers a strong alternative for difficult instances, where exact methods based on SAT encodings are computationally expensive. This approach is especially useful in resource-limited environments where modern SAT solvers are not feasible.}
}
Endnote
%0 Conference Paper
%T Using Submodular Optimization to Approximate Minimum-Size Abductive Path Explanations for Tree-Based Models
%A Louenas Bounia
%B Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2025
%E Silvia Chiappa
%E Sara Magliacane
%F pmlr-v286-bounia25a
%I PMLR
%P 388--397
%U https://proceedings.mlr.press/v286/bounia25a.html
%V 286
%X One of the key challenges of Explainable Artificial Intelligence (XAI) is providing concise and understandable explanations for classification model predictions. An abductive explanation for a given instance is a minimal set of features that justify the prediction. These minimal explanations are valuable for their interpretability, as they eliminate redundant or irrelevant information. However, computing these explanations is difficult, even for simpler classifiers like decision trees. Finding a minimum-size abductive explanation in decision trees is an NP-complete problem, and this complexity extends to random forests for minimum-size majoritary reasons. In this work, we focus on finding minimal sets of features along the paths leading to the decision, called path-abductive explanations. We show that the problem of finding minimum-size path-abductive explanations in decision trees and minimum-size path-majoritary reasons in random forests is also NP-complete. To address this, we reformulate the problem as a submodular optimization task and propose a greedy algorithm with optimality guarantees. Our experiments demonstrate that this algorithm produces near-optimal explanations efficiently and offers a strong alternative for difficult instances, where exact methods based on SAT encodings are computationally expensive. This approach is especially useful in resource-limited environments where modern SAT solvers are not feasible.
APA
Bounia, L. (2025). Using Submodular Optimization to Approximate Minimum-Size Abductive Path Explanations for Tree-Based Models. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:388-397. Available from https://proceedings.mlr.press/v286/bounia25a.html.