Boosting Methodology for Regression Problems

Greg Ridgeway, David Madigan, Thomas S. Richardson
Proceedings of the Seventh International Workshop on Artificial Intelligence and Statistics, PMLR R2, 1999.

Abstract

Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification problem and apply an interpretable form of the boosted naïve Bayes classifier. This induces a regression model that we show to be expressible as an additive model for which we derive estimators and discuss computational issues. We compare the performance of our boosted naïve Bayes regression model with other interpretable multivariate regression procedures.
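The abstract's additive-model view of boosting for regression can be illustrated with a generic sketch: each stage fits a weak learner to the current residuals, and the ensemble prediction is the sum of the (shrunken) stage fits. This is a minimal illustration of additive boosting in general, not the paper's boosted naïve Bayes construction; the stump learner, shrinkage value, and stage count here are illustrative assumptions.

```python
# Generic additive boosting for regression: repeatedly fit a one-split
# "stump" to the residuals and add its shrunken prediction to the model.
# Illustrative only -- NOT the paper's boosted naive Bayes method.

def fit_stump(x, r):
    """Find the threshold split of 1-D inputs x that best fits residuals r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((ri - ml) ** 2 for ri in left)
               + sum((ri - mr) ** 2 for ri in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    # Capture the split and leaf means in default arguments.
    return lambda xi, t=t, ml=ml, mr=mr: ml if xi <= t else mr

def boost_regression(x, y, n_stages=20, shrinkage=0.5):
    """Build an additive model F(x) = sum over stages of shrunken stump fits."""
    stumps = []
    pred = [0.0] * len(x)
    for _ in range(n_stages):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, residuals)
        stumps.append(s)
        pred = [pi + shrinkage * s(xi) for xi, pi in zip(x, pred)]
    return lambda xi: sum(shrinkage * s(xi) for s in stumps)

# Fit a noiseless step function; the residuals shrink geometrically.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.0, 1.0, 1.0, 3.0, 3.0, 3.0]
F = boost_regression(x, y)
```

On this toy step function the first stump already finds the true split, and shrinkage makes each stage close half of the remaining residual, so `F(0.5)` and `F(4.5)` converge to 1 and 3.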

Cite this Paper


BibTeX
@InProceedings{pmlr-vR2-ridgeway99a,
  title     = {Boosting Methodology for Regression Problems},
  author    = {Ridgeway, Greg and Madigan, David and Richardson, Thomas S.},
  booktitle = {Proceedings of the Seventh International Workshop on Artificial Intelligence and Statistics},
  year      = {1999},
  editor    = {Heckerman, David and Whittaker, Joe},
  volume    = {R2},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Jan},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/r2/ridgeway99a/ridgeway99a.pdf},
  url       = {https://proceedings.mlr.press/r2/ridgeway99a.html},
  abstract  = {Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification problem and apply an interpretable form of the boosted naïve Bayes classifier. This induces a regression model that we show to be expressible as an additive model for which we derive estimators and discuss computational issues. We compare the performance of our boosted naïve Bayes regression model with other interpretable multivariate regression procedures.},
  note      = {Reissued by PMLR on 20 August 2020.}
}
Endnote
%0 Conference Paper
%T Boosting Methodology for Regression Problems
%A Greg Ridgeway
%A David Madigan
%A Thomas S. Richardson
%B Proceedings of the Seventh International Workshop on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 1999
%E David Heckerman
%E Joe Whittaker
%F pmlr-vR2-ridgeway99a
%I PMLR
%U https://proceedings.mlr.press/r2/ridgeway99a.html
%V R2
%X Classification problems have dominated research on boosting to date. The application of boosting to regression problems, on the other hand, has received little investigation. In this paper we develop a new boosting method for regression problems. We cast the regression problem as a classification problem and apply an interpretable form of the boosted naïve Bayes classifier. This induces a regression model that we show to be expressible as an additive model for which we derive estimators and discuss computational issues. We compare the performance of our boosted naïve Bayes regression model with other interpretable multivariate regression procedures.
%Z Reissued by PMLR on 20 August 2020.
APA
Ridgeway, G., Madigan, D. & Richardson, T.S. (1999). Boosting Methodology for Regression Problems. Proceedings of the Seventh International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research, R2. Available from https://proceedings.mlr.press/r2/ridgeway99a.html. Reissued by PMLR on 20 August 2020.