Fast Provably Robust Decision Trees and Boosting

Jun-Qi Guo, Ming-Zhuo Teng, Wei Gao, Zhi-Hua Zhou
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:8127-8144, 2022.

Abstract

Learning with adversarial robustness has been a challenge in contemporary machine learning, and recent years have witnessed increasing attention on robust decision trees and ensembles, most of which either incur high computational complexity or lack guarantees of provable robustness. This work proposes the Fast Provably Robust Decision Tree (FPRDT), which attains the smallest computational complexity, O(n log n), by trading off between global and local optimization of the adversarial 0/1 loss. We further develop Provably Robust AdaBoost (PRAdaBoost) on top of our robust decision trees, and present a convergence analysis for the adversarial 0/1 training loss. We conduct extensive experiments to support our approaches; in particular, our approaches are superior to methods without provable robustness, and achieve better or comparable performance to provably robust methods while requiring the smallest running time.
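
For intuition, the following is a minimal sketch (not the paper's algorithm) of the adversarial 0/1 loss that FPRDT optimizes, evaluated here for a single threshold split under an L-infinity perturbation budget eps. The function name, signature, and NumPy implementation are illustrative assumptions; the paper's contribution is an O(n log n) procedure for choosing such splits, which this sketch does not implement.

    import numpy as np

    def adversarial_01_loss(X, y, feature, threshold, left_label, right_label, eps):
        """Adversarial 0/1 loss of a decision stump under an L-infinity budget eps.

        A point counts as an error if the adversary can shift its split feature
        by at most eps so that the stump's prediction disagrees with y.
        (Illustrative sketch; not the FPRDT split-search procedure.)
        """
        xj = X[:, feature]
        # Leaves the adversary can force each point into.
        can_go_left = xj - eps <= threshold
        can_go_right = xj + eps > threshold
        # Robustly correct only if every reachable leaf predicts the true label.
        correct = np.ones(len(y), dtype=bool)
        correct &= ~can_go_left | (left_label == y)
        correct &= ~can_go_right | (right_label == y)
        return 1.0 - correct.mean()

A point is counted as an error whenever some leaf reachable within the eps budget predicts the wrong label, i.e., the loss takes the worst case over all allowed perturbations.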

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-guo22h,
  title     = {Fast Provably Robust Decision Trees and Boosting},
  author    = {Guo, Jun-Qi and Teng, Ming-Zhuo and Gao, Wei and Zhou, Zhi-Hua},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {8127--8144},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/guo22h/guo22h.pdf},
  url       = {https://proceedings.mlr.press/v162/guo22h.html}
}
Endnote
%0 Conference Paper
%T Fast Provably Robust Decision Trees and Boosting
%A Jun-Qi Guo
%A Ming-Zhuo Teng
%A Wei Gao
%A Zhi-Hua Zhou
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-guo22h
%I PMLR
%P 8127--8144
%U https://proceedings.mlr.press/v162/guo22h.html
%V 162
APA
Guo, J., Teng, M., Gao, W. & Zhou, Z. (2022). Fast Provably Robust Decision Trees and Boosting. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:8127-8144. Available from https://proceedings.mlr.press/v162/guo22h.html.