NGBoost: Natural Gradient Boosting for Probabilistic Prediction

Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Ng, Alejandro Schuler
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2690-2700, 2020.

Abstract

We present Natural Gradient Boosting (NGBoost), an algorithm for generic probabilistic prediction via gradient boosting. Typical regression models return a point estimate conditional on covariates, but probabilistic regression models output a full probability distribution over the outcome space, conditional on the covariates. This allows for predictive uncertainty estimation, which is crucial in applications such as healthcare and weather forecasting. NGBoost generalizes gradient boosting to probabilistic regression by treating the parameters of the conditional distribution as targets for a multiparameter boosting algorithm. Furthermore, we show how the Natural Gradient is required to correct the training dynamics of our multiparameter boosting approach. NGBoost can be used with any base learner, any family of distributions with continuous parameters, and any scoring rule. NGBoost matches or exceeds the performance of existing methods for probabilistic prediction while offering additional benefits in flexibility, scalability, and usability. An open-source implementation is available at github.com/stanfordmlgroup/ngboost.
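As a concrete illustration of the scheme the abstract describes, the sketch below fits a heteroskedastic Normal by boosting both parameters (mu, log sigma) on their natural gradients under the log score. This is a minimal toy rendition, not the API of the ngboost package: the stump base learner, the closed-form Fisher preconditioning for this parameterization, and all names here (`nll`, `natural_gradient`, `fit_stump`) are illustrative assumptions.

```python
import numpy as np

def nll(y, mu, log_sigma):
    """Per-example negative log-likelihood (the log score) of Normal(mu, sigma)."""
    sigma = np.exp(log_sigma)
    return 0.5 * np.log(2 * np.pi) + log_sigma + 0.5 * ((y - mu) / sigma) ** 2

def natural_gradient(y, mu, log_sigma):
    """Fisher-preconditioned gradient of the NLL in theta = (mu, log sigma).

    For this parameterization the Fisher information is diag(1/sigma^2, 2),
    so the natural gradient multiplies the ordinary gradient by diag(sigma^2, 1/2).
    """
    sigma = np.exp(log_sigma)
    g_mu = (mu - y) / sigma ** 2            # ordinary d NLL / d mu
    g_ls = 1.0 - ((y - mu) / sigma) ** 2    # ordinary d NLL / d log_sigma
    return sigma ** 2 * g_mu, 0.5 * g_ls    # natural gradients

def fit_stump(x, target):
    """Least-squares regression stump on a 1-D feature (toy base learner)."""
    best_sse, best_split = np.inf, None
    for s in np.quantile(x, np.linspace(0.1, 0.9, 9)):
        left, right = target[x <= s], target[x > s]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_split = sse, (s, left.mean(), right.mean())
    s, lv, rv = best_split
    return lambda q: np.where(q <= s, lv, rv)

# Toy data with covariate-dependent noise, so the model must learn sigma(x).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = np.sin(3 * x) + rng.normal(0, 0.3 + 0.3 * (x > 0), 500)

mu = np.full_like(x, y.mean())
log_sigma = np.full_like(x, np.log(y.std()))
lr = 0.1
start_nll = nll(y, mu, log_sigma).mean()
for _ in range(100):
    g_mu, g_ls = natural_gradient(y, mu, log_sigma)
    mu -= lr * fit_stump(x, g_mu)(x)            # one base learner per parameter,
    log_sigma -= lr * fit_stump(x, g_ls)(x)     # both fit to natural gradients
end_nll = nll(y, mu, log_sigma).mean()
print(f"train NLL: {start_nll:.3f} -> {end_nll:.3f}")
```

Replacing `natural_gradient` with the ordinary gradient makes the two parameter updates poorly scaled relative to each other, which is the training-dynamics problem the paper argues the Natural Gradient corrects.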

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-duan20a,
  title     = {{NGB}oost: Natural Gradient Boosting for Probabilistic Prediction},
  author    = {Duan, Tony and Avati, Anand and Ding, Daisy Yi and Thai, Khanh K. and Basu, Sanjay and Ng, Andrew and Schuler, Alejandro},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2690--2700},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/duan20a/duan20a.pdf},
  url       = {https://proceedings.mlr.press/v119/duan20a.html}
}
Endnote
%0 Conference Paper
%T NGBoost: Natural Gradient Boosting for Probabilistic Prediction
%A Tony Duan
%A Anand Avati
%A Daisy Yi Ding
%A Khanh K. Thai
%A Sanjay Basu
%A Andrew Ng
%A Alejandro Schuler
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-duan20a
%I PMLR
%P 2690--2700
%U https://proceedings.mlr.press/v119/duan20a.html
%V 119
APA
Duan, T., Avati, A., Ding, D.Y., Thai, K.K., Basu, S., Ng, A. & Schuler, A. (2020). NGBoost: Natural Gradient Boosting for Probabilistic Prediction. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2690-2700. Available from https://proceedings.mlr.press/v119/duan20a.html.