Scalable Bayesian Rule Lists

Hongyu Yang, Cynthia Rudin, Margo Seltzer
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3921-3930, 2017.

Abstract

We present an algorithm for building probabilistic rule lists that is two orders of magnitude faster than previous work. Rule list algorithms are competitors for decision tree algorithms. They are associative classifiers, in that they are built from pre-mined association rules. They have a logical structure that is a sequence of IF-THEN rules, identical to a decision list or one-sided decision tree. Instead of using greedy splitting and pruning like decision tree algorithms, we aim to fully optimize over rule lists, striking a practical balance between accuracy, interpretability, and computational speed. The algorithm presented here uses a mixture of theoretical bounds (tight enough to have practical implications as a screening or bounding procedure), computational reuse, and highly tuned language libraries to achieve computational efficiency. Currently, for many practical problems, this method achieves better accuracy and sparsity than decision trees. In many cases, the computational time is practical and often less than that of decision trees.
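The IF-THEN structure described in the abstract can be sketched as a small toy in Python. This is an illustrative example of a probabilistic rule list, not the paper's algorithm or its mined rules; the antecedents, probabilities, and feature names below are hypothetical.

```python
# Toy probabilistic rule list: an ordered sequence of IF-THEN rules.
# Prediction walks the list and returns the probability attached to
# the first rule whose antecedent matches; otherwise the default applies.

def predict(rule_list, default_prob, x):
    """Return the positive-class probability for feature dict x."""
    for antecedent, prob in rule_list:
        # An antecedent is a conjunction of feature conditions,
        # like those produced by pre-mining association rules.
        if all(x.get(feat) == val for feat, val in antecedent.items()):
            return prob
    return default_prob  # no rule fired: fall back to the default rule

# Hypothetical rule list:
#   IF smoker AND age_under_50 THEN p = 0.8
#   ELSE IF smoker             THEN p = 0.6
#   ELSE (default)                  p = 0.1
rules = [
    ({"smoker": True, "age_under_50": True}, 0.8),
    ({"smoker": True}, 0.6),
]
```

Because rules are checked in order, an example matching several antecedents is classified by the earliest one, which is what makes the list read as a one-sided decision tree.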

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-yang17h,
  title     = {Scalable {B}ayesian Rule Lists},
  author    = {Hongyu Yang and Cynthia Rudin and Margo Seltzer},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3921--3930},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/yang17h/yang17h.pdf},
  url       = {https://proceedings.mlr.press/v70/yang17h.html},
  abstract  = {We present an algorithm for building probabilistic rule lists that is two orders of magnitude faster than previous work. Rule list algorithms are competitors for decision tree algorithms. They are associative classifiers, in that they are built from pre-mined association rules. They have a logical structure that is a sequence of IF-THEN rules, identical to a decision list or one-sided decision tree. Instead of using greedy splitting and pruning like decision tree algorithms, we aim to fully optimize over rule lists, striking a practical balance between accuracy, interpretability, and computational speed. The algorithm presented here uses a mixture of theoretical bounds (tight enough to have practical implications as a screening or bounding procedure), computational reuse, and highly tuned language libraries to achieve computational efficiency. Currently, for many practical problems, this method achieves better accuracy and sparsity than decision trees. In many cases, the computational time is practical and often less than that of decision trees.}
}
Endnote
%0 Conference Paper
%T Scalable Bayesian Rule Lists
%A Hongyu Yang
%A Cynthia Rudin
%A Margo Seltzer
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-yang17h
%I PMLR
%P 3921--3930
%U https://proceedings.mlr.press/v70/yang17h.html
%V 70
%X We present an algorithm for building probabilistic rule lists that is two orders of magnitude faster than previous work. Rule list algorithms are competitors for decision tree algorithms. They are associative classifiers, in that they are built from pre-mined association rules. They have a logical structure that is a sequence of IF-THEN rules, identical to a decision list or one-sided decision tree. Instead of using greedy splitting and pruning like decision tree algorithms, we aim to fully optimize over rule lists, striking a practical balance between accuracy, interpretability, and computational speed. The algorithm presented here uses a mixture of theoretical bounds (tight enough to have practical implications as a screening or bounding procedure), computational reuse, and highly tuned language libraries to achieve computational efficiency. Currently, for many practical problems, this method achieves better accuracy and sparsity than decision trees. In many cases, the computational time is practical and often less than that of decision trees.
APA
Yang, H., Rudin, C. & Seltzer, M. (2017). Scalable Bayesian Rule Lists. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3921-3930. Available from https://proceedings.mlr.press/v70/yang17h.html.