Efficient Training of Robust Decision Trees Against Adversarial Examples

Daniël Vos, Sicco Verwer
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10586-10595, 2021.

Abstract

Current state-of-the-art algorithms for training robust decision trees have high runtime costs and require hours to run. We present GROOT, an efficient algorithm for training robust decision trees and random forests that runs in a matter of seconds to minutes. Where before the worst-case Gini impurity was computed iteratively, we find that we can solve this function analytically to improve time complexity from O(n) to O(1) in terms of n samples. Our results on both single trees and ensembles on 14 structured datasets as well as on MNIST and Fashion-MNIST demonstrate that GROOT runs several orders of magnitude faster than the state-of-the-art works and also shows better performance in terms of adversarial accuracy on structured data.
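
Illustration (not from the paper): the speed-up described in the abstract comes from replacing an iterative evaluation of the worst-case Gini impurity with a closed-form one. The Python sketch below is a minimal, hypothetical illustration of the quantity involved, assuming a simplified threat model in which every sample within the perturbation radius of a candidate threshold can be pushed to either side by the attacker. The names gini_impurity, split_gini, and worst_case_split_gini and the movable_counts bookkeeping are ours; the enumeration stands in for the iterative baseline the abstract refers to, not for GROOT's analytic O(1) solution, which is derived in the paper.

    import numpy as np

    def gini_impurity(counts):
        """Gini impurity of a node, given per-class sample counts."""
        total = counts.sum()
        if total == 0:
            return 0.0
        p = counts / total
        return 1.0 - float(np.sum(p ** 2))

    def split_gini(left_counts, right_counts):
        """Gini impurity of a candidate split, weighted by child sizes."""
        n_left, n_right = left_counts.sum(), right_counts.sum()
        return (n_left * gini_impurity(left_counts)
                + n_right * gini_impurity(right_counts)) / (n_left + n_right)

    def worst_case_split_gini(left_counts, right_counts, movable_counts):
        """Worst-case (adversarial) Gini of a split, found by enumeration.

        movable_counts[c] = number of class-c samples lying within the
        perturbation radius of the threshold; the attacker may push each of
        them to either side and picks the placement that maximizes the
        split's impurity. This nested loop is an illustrative iterative
        baseline; GROOT's contribution is a closed-form expression for
        this maximum.
        """
        worst = -1.0
        for moved_0 in range(movable_counts[0] + 1):
            for moved_1 in range(movable_counts[1] + 1):
                to_left = np.array([moved_0, moved_1])
                worst = max(worst, split_gini(left_counts + to_left,
                                              right_counts + movable_counts - to_left))
        return worst

    # Toy usage: fixed samples per side plus 5 samples of each class near the threshold.
    left = np.array([15, 5])
    right = np.array([4, 16])
    movable = np.array([5, 5])
    print(split_gini(left, right))                      # impurity ignoring the attacker
    print(worst_case_split_gini(left, right, movable))  # impurity under the worst placement

A robust tree learner scores every candidate split by this worst-case impurity and keeps the split that minimizes it; doing so with an enumeration at every threshold is what makes prior approaches slow, which is why a constant-time evaluation matters.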

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-vos21a,
  title     = {Efficient Training of Robust Decision Trees Against Adversarial Examples},
  author    = {Vos, Dani{\"e}l and Verwer, Sicco},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {10586--10595},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/vos21a/vos21a.pdf},
  url       = {https://proceedings.mlr.press/v139/vos21a.html},
  abstract  = {Current state-of-the-art algorithms for training robust decision trees have high runtime costs and require hours to run. We present GROOT, an efficient algorithm for training robust decision trees and random forests that runs in a matter of seconds to minutes. Where before the worst-case Gini impurity was computed iteratively, we find that we can solve this function analytically to improve time complexity from O(n) to O(1) in terms of n samples. Our results on both single trees and ensembles on 14 structured datasets as well as on MNIST and Fashion-MNIST demonstrate that GROOT runs several orders of magnitude faster than the state-of-the-art works and also shows better performance in terms of adversarial accuracy on structured data.}
}
Endnote
%0 Conference Paper
%T Efficient Training of Robust Decision Trees Against Adversarial Examples
%A Daniël Vos
%A Sicco Verwer
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-vos21a
%I PMLR
%P 10586--10595
%U https://proceedings.mlr.press/v139/vos21a.html
%V 139
%X Current state-of-the-art algorithms for training robust decision trees have high runtime costs and require hours to run. We present GROOT, an efficient algorithm for training robust decision trees and random forests that runs in a matter of seconds to minutes. Where before the worst-case Gini impurity was computed iteratively, we find that we can solve this function analytically to improve time complexity from O(n) to O(1) in terms of n samples. Our results on both single trees and ensembles on 14 structured datasets as well as on MNIST and Fashion-MNIST demonstrate that GROOT runs several orders of magnitude faster than the state-of-the-art works and also shows better performance in terms of adversarial accuracy on structured data.
APA
Vos, D. & Verwer, S. (2021). Efficient Training of Robust Decision Trees Against Adversarial Examples. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:10586-10595. Available from https://proceedings.mlr.press/v139/vos21a.html.