Smaller, more accurate regression forests using tree alternating optimization

Arman Zharmagambetov, Miguel Carreira-Perpinan
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:11398-11408, 2020.

Abstract

Regression forests, based on ensemble approaches such as bagging or boosting, have long been recognized as the leading off-the-shelf method for regression. However, forests rely on a greedy top-down procedure such as CART to learn each tree. We extend a recent algorithm for learning classification trees, Tree Alternating Optimization (TAO), to the regression case, and use it with bagging to construct regression forests of oblique trees, having hyperplane splits at the decision nodes. In a wide range of datasets, we show that the resulting forests exceed the accuracy of state-of-the-art algorithms such as random forests, AdaBoost or gradient boosting, often considerably, while yielding forests that usually have fewer and shallower trees, and hence fewer parameters and faster inference overall. This result has an immense practical impact and advocates for the power of optimization in ensemble learning.
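To make the alternating scheme concrete, below is a minimal Python sketch of TAO-style training for one oblique regression tree, followed by bagging. This is an illustrative approximation, not the authors' implementation: it assumes a complete binary tree of fixed depth with constant leaves, uses scikit-learn's LogisticRegression as a stand-in solver for each decision node's reduced binary problem (the paper's reduced problem is a weighted 0/1-loss classification, solved with a different surrogate), and all function names are hypothetical. Each leaf is refit to the mean target of its points; each decision node refits its hyperplane so that points are sent toward whichever child subtree currently gives the lower squared error.

# Illustrative sketch only -- NOT the authors' code. Assumptions: complete
# oblique binary tree of fixed depth, constant leaves, LogisticRegression
# as a surrogate for each node's reduced problem, plain bootstrap bagging.
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_tree(depth, n_features, rng):
    n_internal = 2 ** depth - 1
    return {"depth": depth,
            "W": rng.normal(size=(n_internal, n_features)),  # hyperplane weights
            "b": rng.normal(size=n_internal),                # hyperplane biases
            "leaf": np.zeros(2 ** depth)}                    # constant leaf outputs

def routes(tree, X):
    """Route each row of X from the root; return the internal node visited
    at every depth and the final leaf index."""
    D, cur = tree["depth"], np.zeros(len(X), dtype=int)
    visits = np.zeros((len(X), D), dtype=int)
    for d in range(D):
        visits[:, d] = cur
        s = np.einsum("ij,ij->i", X, tree["W"][cur]) + tree["b"][cur]
        cur = 2 * cur + 1 + (s > 0)          # s > 0 means "go right"
    return visits, cur - (2 ** D - 1)

def subtree_predict(tree, X, node, d):
    """Predict for rows of X that enter `node` at depth d."""
    cur = np.full(len(X), node)
    for _ in range(tree["depth"] - d):
        s = np.einsum("ij,ij->i", X, tree["W"][cur]) + tree["b"][cur]
        cur = 2 * cur + 1 + (s > 0)
    return tree["leaf"][cur - (2 ** tree["depth"] - 1)]

def tao_pass(tree, X, y):
    D = tree["depth"]
    visits, leaves = routes(tree, X)
    for l in range(2 ** D):                  # leaves: mean target of their points
        m = leaves == l
        if m.any():
            tree["leaf"][l] = y[m].mean()
    for d in reversed(range(D)):             # decision nodes, deepest first
        for i in range(2 ** d - 1, 2 ** (d + 1) - 1):
            m = visits[:, d] == i
            if not m.any():
                continue
            Xi, yi = X[m], y[m]
            eL = (subtree_predict(tree, Xi, 2 * i + 1, d + 1) - yi) ** 2
            eR = (subtree_predict(tree, Xi, 2 * i + 2, d + 1) - yi) ** 2
            care = eL != eR                  # only points whose routing matters
            pref = (eR < eL).astype(int)     # 1 = better off going right
            if care.any() and len(np.unique(pref[care])) == 2:
                clf = LogisticRegression().fit(Xi[care], pref[care])
                tree["W"][i], tree["b"][i] = clf.coef_[0], clf.intercept_[0]

def fit_tao_tree(X, y, depth=4, passes=5, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    tree = make_tree(depth, X.shape[1], rng)
    for _ in range(passes):                  # alternate node updates to convergence
        tao_pass(tree, X, y)
    return tree

def predict(tree, X):
    return tree["leaf"][routes(tree, X)[1]]

def fit_forest(X, y, n_trees=10, depth=4, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):                 # bagging: one TAO tree per bootstrap
        idx = rng.integers(0, len(X), len(X))
        trees.append(fit_tao_tree(X[idx], y[idx], depth, rng=rng))
    return trees

def forest_predict(trees, X):
    return np.mean([predict(t, X) for t in trees], axis=0)

On a toy dataset, forest_predict(fit_forest(X, y), X) returns the bagged prediction. The paper's node optimization, regularization and initialization are more careful than this stand-in, which is only meant to convey the alternating structure: leaves and decision nodes are revisited in turn, each solving a small local problem over the points that reach it.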

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-zharmagambetov20a,
  title     = {Smaller, more accurate regression forests using tree alternating optimization},
  author    = {Zharmagambetov, Arman and Carreira-Perpinan, Miguel},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {11398--11408},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/zharmagambetov20a/zharmagambetov20a.pdf},
  url       = {https://proceedings.mlr.press/v119/zharmagambetov20a.html},
  abstract  = {Regression forests, based on ensemble approaches such as bagging or boosting, have long been recognized as the leading off-the-shelf method for regression. However, forests rely on a greedy top-down procedure such as CART to learn each tree. We extend a recent algorithm for learning classification trees, Tree Alternating Optimization (TAO), to the regression case, and use it with bagging to construct regression forests of oblique trees, having hyperplane splits at the decision nodes. In a wide range of datasets, we show that the resulting forests exceed the accuracy of state-of-the-art algorithms such as random forests, AdaBoost or gradient boosting, often considerably, while yielding forests that have usually fewer and shallower trees and hence fewer parameters and faster inference overall. This result has an immense practical impact and advocates for the power of optimization in ensemble learning.}
}
Endnote
%0 Conference Paper
%T Smaller, more accurate regression forests using tree alternating optimization
%A Arman Zharmagambetov
%A Miguel Carreira-Perpinan
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-zharmagambetov20a
%I PMLR
%P 11398--11408
%U https://proceedings.mlr.press/v119/zharmagambetov20a.html
%V 119
%X Regression forests, based on ensemble approaches such as bagging or boosting, have long been recognized as the leading off-the-shelf method for regression. However, forests rely on a greedy top-down procedure such as CART to learn each tree. We extend a recent algorithm for learning classification trees, Tree Alternating Optimization (TAO), to the regression case, and use it with bagging to construct regression forests of oblique trees, having hyperplane splits at the decision nodes. In a wide range of datasets, we show that the resulting forests exceed the accuracy of state-of-the-art algorithms such as random forests, AdaBoost or gradient boosting, often considerably, while yielding forests that have usually fewer and shallower trees and hence fewer parameters and faster inference overall. This result has an immense practical impact and advocates for the power of optimization in ensemble learning.
APA
Zharmagambetov, A. & Carreira-Perpinan, M. (2020). Smaller, more accurate regression forests using tree alternating optimization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:11398-11408. Available from https://proceedings.mlr.press/v119/zharmagambetov20a.html.