Boosted Histogram Transform for Regression

Yuchao Cai, Hanyuan Hang, Hanfang Yang, Zhouchen Lin
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1251-1261, 2020.

Abstract

In this paper, we propose a boosting algorithm for regression problems called \emph{boosted histogram transform for regression} (BHTR) based on histogram transforms composed of random rotations, stretchings, and translations. From the theoretical perspective, we first prove fast convergence rates for BHTR under the assumption that the target function lies in the spaces $C^{0,\alpha}$. Moreover, if the target function resides in the subspace $C^{1,\alpha}$, by establishing the upper bound of the convergence rate for the boosted regressor, i.e. BHTR, and the lower bound for base regressors, i.e. histogram transform regressors (HTR), we manage to explain the benefits of the boosting procedure. In the experiments, compared with other state-of-the-art algorithms such as gradient boosted regression tree (GBRT), Breiman’s forest, and kernel-based methods, our BHTR algorithm shows promising performance on both synthetic and real datasets.
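The abstract's recipe can be illustrated with a small sketch: draw a random rotation, stretching, and translation to define histogram cells, fit a piecewise-constant base regressor (HTR) on those cells, and boost by repeatedly fitting fresh random transforms to the residuals. All names below (`random_histogram_transform`, `HTRegressor`, `boosted_htr`) and parameter choices (stretch range, learning rate, number of rounds) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def random_histogram_transform(d, rng=None):
    """One random histogram transform H(x) = floor(S Q x + b).
    Q: random rotation, S: random diagonal stretching, b: random translation.
    Illustrative sketch only; parameter ranges are assumptions."""
    rng = np.random.default_rng(rng)
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))   # random rotation
    S = np.diag(rng.uniform(0.5, 2.0, size=d))          # random stretching
    b = rng.uniform(0.0, 1.0, size=d)                   # random translation
    A = S @ Q
    def transform(X):
        # Map each sample to the integer index of its histogram cell.
        return np.floor(X @ A.T + b).astype(int)
    return transform

class HTRegressor:
    """Base learner: piecewise-constant fit (cell means) on random histogram cells."""
    def fit(self, X, y, rng=None):
        self.H = random_histogram_transform(X.shape[1], rng=rng)
        sums, counts = {}, {}
        for key, yi in zip(map(tuple, self.H(X)), y):
            sums[key] = sums.get(key, 0.0) + yi
            counts[key] = counts.get(key, 0) + 1
        self.means = {k: sums[k] / counts[k] for k in sums}
        self.default = float(np.mean(y))  # fallback for cells unseen in training
        return self

    def predict(self, X):
        return np.array([self.means.get(tuple(k), self.default) for k in self.H(X)])

def boosted_htr(X, y, n_rounds=20, lr=0.5, seed=0):
    """Boosting under squared loss: each round fits a fresh random HTR to residuals."""
    f0 = float(np.mean(y))
    pred = np.full(len(y), f0)
    learners = []
    for t in range(n_rounds):
        h = HTRegressor().fit(X, y - pred, rng=seed + t)
        pred += lr * h.predict(X)
        learners.append(h)
    return f0, learners

# Usage on a synthetic regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
f0, learners = boosted_htr(X, y)
```

The random rotation makes the cell boundaries axis-independent, which is what distinguishes a histogram transform from a plain axis-aligned histogram partition; boosting then combines many such coarse piecewise-constant fits into a smoother aggregate.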

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cai20a,
  title     = {Boosted Histogram Transform for Regression},
  author    = {Cai, Yuchao and Hang, Hanyuan and Yang, Hanfang and Lin, Zhouchen},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1251--1261},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/cai20a/cai20a.pdf},
  url       = {http://proceedings.mlr.press/v119/cai20a.html},
  abstract  = {In this paper, we propose a boosting algorithm for regression problems called \emph{boosted histogram transform for regression} (BHTR) based on histogram transforms composed of random rotations, stretchings, and translations. From the theoretical perspective, we first prove fast convergence rates for BHTR under the assumption that the target function lies in the spaces $C^{0,\alpha}$. Moreover, if the target function resides in the subspace $C^{1,\alpha}$, by establishing the upper bound of the convergence rate for the boosted regressor, i.e. BHTR, and the lower bound for base regressors, i.e. histogram transform regressors (HTR), we manage to explain the benefits of the boosting procedure. In the experiments, compared with other state-of-the-art algorithms such as gradient boosted regression tree (GBRT), Breiman's forest, and kernel-based methods, our BHTR algorithm shows promising performance on both synthetic and real datasets.}
}
Endnote
%0 Conference Paper
%T Boosted Histogram Transform for Regression
%A Yuchao Cai
%A Hanyuan Hang
%A Hanfang Yang
%A Zhouchen Lin
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cai20a
%I PMLR
%P 1251--1261
%U http://proceedings.mlr.press/v119/cai20a.html
%V 119
%X In this paper, we propose a boosting algorithm for regression problems called \emph{boosted histogram transform for regression} (BHTR) based on histogram transforms composed of random rotations, stretchings, and translations. From the theoretical perspective, we first prove fast convergence rates for BHTR under the assumption that the target function lies in the spaces $C^{0,\alpha}$. Moreover, if the target function resides in the subspace $C^{1,\alpha}$, by establishing the upper bound of the convergence rate for the boosted regressor, i.e. BHTR, and the lower bound for base regressors, i.e. histogram transform regressors (HTR), we manage to explain the benefits of the boosting procedure. In the experiments, compared with other state-of-the-art algorithms such as gradient boosted regression tree (GBRT), Breiman's forest, and kernel-based methods, our BHTR algorithm shows promising performance on both synthetic and real datasets.
APA
Cai, Y., Hang, H., Yang, H. & Lin, Z. (2020). Boosted Histogram Transform for Regression. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1251-1261. Available from http://proceedings.mlr.press/v119/cai20a.html.

Related Material