DALE: Differential Accumulated Local Effects for efficient and accurate global explanations

Vasilis Gkolemis, Theodore Dalamagas, Christos Diou
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:375-390, 2023.

Abstract

Accumulated Local Effects (ALE) is a method for accurately estimating feature effects, overcoming fundamental failure modes of previously proposed methods, such as Partial Dependence Plots. However, \textit{ALE's approximation}, i.e., the method for estimating ALE from the limited samples of the training set, has two weaknesses. First, it does not scale well when the input is high-dimensional, and, second, it is vulnerable to out-of-distribution (OOD) sampling when the training set is relatively small. In this paper, we propose a novel ALE approximation, called Differential Accumulated Local Effects (DALE), which can be used when the ML model is differentiable and an automatic differentiation framework is available. Our proposal has significant computational advantages, making feature effect estimation applicable to high-dimensional machine learning scenarios with near-zero computational overhead. Furthermore, DALE does not create artificial points for calculating the feature effect, resolving misleading estimates due to OOD sampling. Finally, we formally prove that, under certain hypotheses, DALE is an unbiased estimator of ALE, and we present a method for quantifying the standard error of the explanation. Experiments on both synthetic and real datasets demonstrate the value of the proposed approach.
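To make the abstract's idea concrete, here is a minimal sketch of a DALE-style estimator for a single feature: instead of perturbing points to form finite differences (as in the standard ALE approximation), it averages the model's gradients at the real training points within each bin, then accumulates and centers the bin effects. The names `dale_1d` and `grad_f` are illustrative, not from the paper; in practice `grad_f` would come from an automatic differentiation framework.

```python
import numpy as np

def dale_1d(X, grad_f, feature, K=20):
    """Sketch of a DALE-style feature-effect estimate for one feature.

    X       : (N, D) training data
    grad_f  : callable returning the model's gradients at X, shape (N, D)
              (in the paper these come from an autodiff framework)
    feature : index s of the feature of interest
    K       : number of bins along the feature's range
    """
    grads = grad_f(X)[:, feature]            # df/dx_s at the real training points
    z = X[:, feature]
    edges = np.linspace(z.min(), z.max(), K + 1)
    idx = np.clip(np.digitize(z, edges) - 1, 0, K - 1)
    # average the point gradients falling in each bin (local effects)
    mu = np.zeros(K)
    for k in range(K):
        in_bin = idx == k
        mu[k] = grads[in_bin].mean() if in_bin.any() else 0.0
    dz = np.diff(edges)
    ale = np.concatenate([[0.0], np.cumsum(mu * dz)])  # accumulate bin effects
    ale -= ale.mean()                                  # center to zero mean
    return edges, ale
```

Because only gradients at existing training points are used, no artificial (potentially OOD) samples are created, and one batch of backward passes yields the gradients for all features at once, which is the source of the claimed computational advantage.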

Cite this Paper


BibTeX
@InProceedings{pmlr-v189-gkolemis23a,
  title     = {DALE: Differential Accumulated Local Effects for efficient and accurate global explanations},
  author    = {Gkolemis, Vasilis and Dalamagas, Theodore and Diou, Christos},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {375--390},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/gkolemis23a/gkolemis23a.pdf},
  url       = {https://proceedings.mlr.press/v189/gkolemis23a.html},
  abstract  = {Accumulated Local Effects (ALE) is a method for accurately estimating feature effects, overcoming fundamental failure modes of previously proposed methods, such as Partial Dependence Plots. However, \textit{ALE's approximation}, i.e., the method for estimating ALE from the limited samples of the training set, has two weaknesses. First, it does not scale well when the input is high-dimensional, and, second, it is vulnerable to out-of-distribution (OOD) sampling when the training set is relatively small. In this paper, we propose a novel ALE approximation, called Differential Accumulated Local Effects (DALE), which can be used when the ML model is differentiable and an automatic differentiation framework is available. Our proposal has significant computational advantages, making feature effect estimation applicable to high-dimensional machine learning scenarios with near-zero computational overhead. Furthermore, DALE does not create artificial points for calculating the feature effect, resolving misleading estimates due to OOD sampling. Finally, we formally prove that, under certain hypotheses, DALE is an unbiased estimator of ALE, and we present a method for quantifying the standard error of the explanation. Experiments on both synthetic and real datasets demonstrate the value of the proposed approach.}
}
Endnote
%0 Conference Paper
%T DALE: Differential Accumulated Local Effects for efficient and accurate global explanations
%A Vasilis Gkolemis
%A Theodore Dalamagas
%A Christos Diou
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-gkolemis23a
%I PMLR
%P 375--390
%U https://proceedings.mlr.press/v189/gkolemis23a.html
%V 189
%X Accumulated Local Effects (ALE) is a method for accurately estimating feature effects, overcoming fundamental failure modes of previously proposed methods, such as Partial Dependence Plots. However, ALE's approximation, i.e., the method for estimating ALE from the limited samples of the training set, has two weaknesses. First, it does not scale well when the input is high-dimensional, and, second, it is vulnerable to out-of-distribution (OOD) sampling when the training set is relatively small. In this paper, we propose a novel ALE approximation, called Differential Accumulated Local Effects (DALE), which can be used when the ML model is differentiable and an automatic differentiation framework is available. Our proposal has significant computational advantages, making feature effect estimation applicable to high-dimensional machine learning scenarios with near-zero computational overhead. Furthermore, DALE does not create artificial points for calculating the feature effect, resolving misleading estimates due to OOD sampling. Finally, we formally prove that, under certain hypotheses, DALE is an unbiased estimator of ALE, and we present a method for quantifying the standard error of the explanation. Experiments on both synthetic and real datasets demonstrate the value of the proposed approach.
APA
Gkolemis, V., Dalamagas, T. & Diou, C. (2023). DALE: Differential Accumulated Local Effects for efficient and accurate global explanations. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:375-390. Available from https://proceedings.mlr.press/v189/gkolemis23a.html.