Black-Box Alpha Divergence Minimization

Jose Hernandez-Lobato, Yingzhen Li, Mark Rowland, Thang Bui, Daniel Hernandez-Lobato, Richard Turner
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1511-1520, 2016.

Abstract

Black-box alpha (BB-α) is a new approximate inference method based on the minimization of α-divergences. BB-α scales to large datasets because it can be implemented using stochastic gradient descent. BB-α can be applied to complex probabilistic models with little effort since it only requires as input the likelihood function and its gradients. These gradients can be easily obtained using automatic differentiation. By changing the divergence parameter α, the method is able to interpolate between variational Bayes (VB) (α → 0) and an algorithm similar to expectation propagation (EP) (α = 1). Experiments on probit regression and neural network regression and classification problems show that BB-α with non-standard settings of α, such as α = 0.5, usually produces better predictions than with α → 0 (VB) or α = 1 (EP).
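As the abstract notes, the method needs only the likelihood and its gradients (via automatic differentiation) and is trained with stochastic gradient descent. The sketch below illustrates the flavor of such an objective for Bayesian probit regression with a factorized Gaussian approximation; the simplified energy, the probit model, and all names and hyperparameters are illustrative assumptions, not the paper's exact implementation.

```python
# A minimal sketch (not the authors' code) of a BB-alpha-style objective.
# Assumed simplified energy:
#   L_alpha(q) = KL(q || p0) - (1/alpha) * sum_n log E_q[ p(y_n | x_n, w)^alpha ]
# Estimated by Monte Carlo; as alpha -> 0 the per-datum term tends to
# E_q[log p(y_n | x_n, w)], recovering the usual variational Bayes bound.
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp
from jax.scipy.stats import norm


def bb_alpha_energy(params, X, y, key, alpha=0.5, n_samples=16):
    """Monte Carlo estimate of the (assumed) BB-alpha-style energy."""
    mu, log_sigma = params
    sigma = jnp.exp(log_sigma)
    # KL(q || p0) between factorized Gaussians q = N(mu, sigma^2), p0 = N(0, 1).
    kl = 0.5 * jnp.sum(sigma ** 2 + mu ** 2 - 1.0 - 2.0 * log_sigma)
    # Reparameterized samples w ~ q(w), shape (S, D).
    eps = jax.random.normal(key, (n_samples, mu.shape[0]))
    w = mu + sigma * eps
    # Probit log-likelihood log Phi(y_n * w^T x_n) for every sample and datum.
    log_lik = norm.logcdf((w @ X.T) * y)             # (S, N)
    # log E_q[p(y_n | x_n, w)^alpha], averaged over the S samples.
    site = logsumexp(alpha * log_lik, axis=0) - jnp.log(n_samples)
    return kl - jnp.sum(site) / alpha


# The gradients the method requires come from automatic differentiation.
energy_grad = jax.grad(bb_alpha_energy)

# Usage sketch: one SGD step on toy data (shapes and values are illustrative).
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (100, 5))
y = jnp.sign(X[:, 0] + 0.1)                          # labels in {-1, +1}
params = (jnp.zeros(5), jnp.zeros(5))
grads = energy_grad(params, X, y, key)
params = tuple(p - 0.01 * g for p, g in zip(params, grads))
```

For large datasets, the sum over the per-datum terms would be computed on a minibatch and rescaled to the full data size, which is what makes the objective amenable to stochastic gradient descent.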

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-hernandez-lobatob16,
  title = {Black-Box Alpha Divergence Minimization},
  author = {Hernandez-Lobato, Jose and Li, Yingzhen and Rowland, Mark and Bui, Thang and Hernandez-Lobato, Daniel and Turner, Richard},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages = {1511--1520},
  year = {2016},
  editor = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume = {48},
  series = {Proceedings of Machine Learning Research},
  address = {New York, New York, USA},
  month = {20--22 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v48/hernandez-lobatob16.pdf},
  url = {https://proceedings.mlr.press/v48/hernandez-lobatob16.html},
  abstract = {Black-box alpha (BB-α) is a new approximate inference method based on the minimization of α-divergences. BB-α scales to large datasets because it can be implemented using stochastic gradient descent. BB-α can be applied to complex probabilistic models with little effort since it only requires as input the likelihood function and its gradients. These gradients can be easily obtained using automatic differentiation. By changing the divergence parameter α, the method is able to interpolate between variational Bayes (VB) (α → 0) and an algorithm similar to expectation propagation (EP) (α = 1). Experiments on probit regression and neural network regression and classification problems show that BB-α with non-standard settings of α, such as α = 0.5, usually produces better predictions than with α → 0 (VB) or α = 1 (EP).}
}
Endnote
%0 Conference Paper
%T Black-Box Alpha Divergence Minimization
%A Jose Hernandez-Lobato
%A Yingzhen Li
%A Mark Rowland
%A Thang Bui
%A Daniel Hernandez-Lobato
%A Richard Turner
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-hernandez-lobatob16
%I PMLR
%P 1511--1520
%U https://proceedings.mlr.press/v48/hernandez-lobatob16.html
%V 48
%X Black-box alpha (BB-α) is a new approximate inference method based on the minimization of α-divergences. BB-α scales to large datasets because it can be implemented using stochastic gradient descent. BB-α can be applied to complex probabilistic models with little effort since it only requires as input the likelihood function and its gradients. These gradients can be easily obtained using automatic differentiation. By changing the divergence parameter α, the method is able to interpolate between variational Bayes (VB) (α → 0) and an algorithm similar to expectation propagation (EP) (α = 1). Experiments on probit regression and neural network regression and classification problems show that BB-α with non-standard settings of α, such as α = 0.5, usually produces better predictions than with α → 0 (VB) or α = 1 (EP).
RIS
TY - CPAPER
TI - Black-Box Alpha Divergence Minimization
AU - Jose Hernandez-Lobato
AU - Yingzhen Li
AU - Mark Rowland
AU - Thang Bui
AU - Daniel Hernandez-Lobato
AU - Richard Turner
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-hernandez-lobatob16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 1511
EP - 1520
L1 - http://proceedings.mlr.press/v48/hernandez-lobatob16.pdf
UR - https://proceedings.mlr.press/v48/hernandez-lobatob16.html
AB - Black-box alpha (BB-α) is a new approximate inference method based on the minimization of α-divergences. BB-α scales to large datasets because it can be implemented using stochastic gradient descent. BB-α can be applied to complex probabilistic models with little effort since it only requires as input the likelihood function and its gradients. These gradients can be easily obtained using automatic differentiation. By changing the divergence parameter α, the method is able to interpolate between variational Bayes (VB) (α → 0) and an algorithm similar to expectation propagation (EP) (α = 1). Experiments on probit regression and neural network regression and classification problems show that BB-α with non-standard settings of α, such as α = 0.5, usually produces better predictions than with α → 0 (VB) or α = 1 (EP).
ER -
APA
Hernandez-Lobato, J., Li, Y., Rowland, M., Bui, T., Hernandez-Lobato, D. & Turner, R. (2016). Black-Box Alpha Divergence Minimization. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1511-1520. Available from https://proceedings.mlr.press/v48/hernandez-lobatob16.html.
