Forest-type Regression with General Losses and Robust Forest

Alexander Hanbo Li, Andrew Martin
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2091-2100, 2017.

Abstract

This paper introduces a new general framework for forest-type regression which allows the development of robust forest regressors by selecting from a large family of robust loss functions. In particular, when the squared error and quantile losses are plugged in, it recovers the classical random forest and the quantile random forest, respectively. We then use robust loss functions to develop more robust forest-type regression algorithms. In the experiments, we show on both simulated and real data that our robust forests are far less sensitive to outliers, and that choosing the right number of nearest neighbors can quickly improve the generalization performance of the random forest.
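The framework described above can be illustrated with a short sketch: a standard random forest supplies local neighborhood weights (how often a training point shares a leaf with the query point), and the prediction is the minimizer of the correspondingly weighted loss. With squared error this minimizer is the usual forest average; swapping in a robust loss (Huber below) yields a robust forest in the spirit of the paper. This is an illustrative reconstruction, not the authors' reference implementation; the helper names and the choice of `delta` are assumptions.

```python
# Illustrative sketch (not the authors' code): forest-type regression as
# locally weighted loss minimization. A fitted random forest provides the
# weights; the loss function determines the regressor.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)
y[:5] += 20.0  # inject gross outliers

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

def forest_weights(forest, X_train, x):
    """Weight of each training point at query x: the average over trees of
    the indicator that the point falls in x's leaf, normalized per tree."""
    leaves_train = forest.apply(X_train)           # shape (n, n_trees)
    leaves_x = forest.apply(x.reshape(1, -1))[0]   # shape (n_trees,)
    same = leaves_train == leaves_x                # co-occurrence indicators
    return (same / same.sum(axis=0)).mean(axis=1)

def robust_forest_predict(forest, X_train, y_train, x, delta=1.0):
    """argmin_theta sum_i w_i(x) * huber_delta(y_i - theta).
    With squared loss instead of Huber, this reduces to the forest average."""
    w = forest_weights(forest, X_train, x)
    def weighted_huber(theta):
        r = np.abs(y_train - theta)
        h = np.where(r <= delta, 0.5 * r**2, delta * (r - 0.5 * delta))
        return np.dot(w, h)
    return minimize_scalar(weighted_huber).x

pred = robust_forest_predict(forest, X, y, np.array([0.5]))
print(pred)  # stays close to sin(0.5) despite the injected outliers
```

Because the Huber loss has a bounded derivative, training points with grossly corrupted responses exert only limited pull on the minimizer, whereas the squared-error (plain forest) prediction can be dragged arbitrarily far by a single outlier in a small leaf.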

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-li17e,
  title     = {Forest-type Regression with General Losses and Robust Forest},
  author    = {Alexander Hanbo Li and Andrew Martin},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2091--2100},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/li17e/li17e.pdf},
  url       = {https://proceedings.mlr.press/v70/li17e.html},
  abstract  = {This paper introduces a new general framework for forest-type regression which allows the development of robust forest regressors by selecting from a large family of robust loss functions. In particular, when plugged in the squared error and quantile losses, it will recover the classical random forest and quantile random forest. We then use robust loss functions to develop more robust forest-type regression algorithms. In the experiments, we show by simulation and real data that our robust forests are indeed much more insensitive to outliers, and choosing the right number of nearest neighbors can quickly improve the generalization performance of random forest.}
}
Endnote
%0 Conference Paper
%T Forest-type Regression with General Losses and Robust Forest
%A Alexander Hanbo Li
%A Andrew Martin
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-li17e
%I PMLR
%P 2091--2100
%U https://proceedings.mlr.press/v70/li17e.html
%V 70
%X This paper introduces a new general framework for forest-type regression which allows the development of robust forest regressors by selecting from a large family of robust loss functions. In particular, when plugged in the squared error and quantile losses, it will recover the classical random forest and quantile random forest. We then use robust loss functions to develop more robust forest-type regression algorithms. In the experiments, we show by simulation and real data that our robust forests are indeed much more insensitive to outliers, and choosing the right number of nearest neighbors can quickly improve the generalization performance of random forest.
APA
Li, A. H., & Martin, A. (2017). Forest-type Regression with General Losses and Robust Forest. In Proceedings of the 34th International Conference on Machine Learning, Proceedings of Machine Learning Research 70:2091-2100. Available from https://proceedings.mlr.press/v70/li17e.html.