Collapsed Variational Inference for Sum-Product Networks

Han Zhao, Tameem Adel, Geoff Gordon, Brandon Amos
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1310-1318, 2016.

Abstract

Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in time linear in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are prone to overfitting compared with more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable, and even approximation techniques such as standard variational inference and posterior sampling are computationally infeasible for networks of moderate size, due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient and easy to implement, and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum-likelihood-based approach.
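To make the "exact inference in linear time" claim concrete, here is a minimal, hypothetical sketch (not the paper's code) of a toy SPN over two binary variables. Sum nodes compute weighted averages of their children, product nodes compute products, and leaves are univariate Bernoulli distributions; a single bottom-up pass evaluates every node once, so inference cost is linear in the number of edges.

```python
class Leaf:
    """Bernoulli leaf over one binary variable (index `var`)."""
    def __init__(self, var, p):
        self.var, self.p = var, p

    def value(self, x):
        # Likelihood of the observed value of this leaf's variable.
        return self.p if x[self.var] == 1 else 1.0 - self.p


class Product:
    """Product node: children must cover disjoint sets of variables."""
    def __init__(self, children):
        self.children = children

    def value(self, x):
        v = 1.0
        for c in self.children:
            v *= c.value(x)
        return v


class Sum:
    """Sum node: a mixture of children with normalized weights."""
    def __init__(self, children, weights):
        assert abs(sum(weights) - 1.0) < 1e-9
        self.children, self.weights = children, weights

    def value(self, x):
        return sum(w * c.value(x) for w, c in zip(self.weights, self.children))


# Toy SPN over binary variables X0 and X1: a two-component mixture
# of fully factorized distributions.
root = Sum(
    [Product([Leaf(0, 0.8), Leaf(1, 0.3)]),
     Product([Leaf(0, 0.2), Leaf(1, 0.7)])],
    [0.6, 0.4],
)

# One bottom-up pass computes the exact joint probability:
# P(X0=1, X1=0) = 0.6 * (0.8 * 0.7) + 0.4 * (0.2 * 0.3) = 0.36
p = root.value({0: 1, 1: 0})
```

Because the network is complete and decomposable with normalized sum weights, the root value is a normalized joint probability; summing `root.value` over all four assignments of (X0, X1) gives exactly 1. The parameter learning methods the abstract discusses estimate the `weights` and leaf parameters in structures like this.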

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-zhaoa16,
  title     = {Collapsed Variational Inference for Sum-Product Networks},
  author    = {Han Zhao and Tameem Adel and Geoff Gordon and Brandon Amos},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1310--1318},
  year      = {2016},
  editor    = {Maria Florina Balcan and Kilian Q. Weinberger},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/zhaoa16.pdf},
  url       = {http://proceedings.mlr.press/v48/zhaoa16.html},
  abstract  = {Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.}
}
Endnote
%0 Conference Paper
%T Collapsed Variational Inference for Sum-Product Networks
%A Han Zhao
%A Tameem Adel
%A Geoff Gordon
%A Brandon Amos
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-zhaoa16
%I PMLR
%J Proceedings of Machine Learning Research
%P 1310--1318
%U http://proceedings.mlr.press
%V 48
%W PMLR
%X Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.
RIS
TY  - CPAPER
TI  - Collapsed Variational Inference for Sum-Product Networks
AU  - Han Zhao
AU  - Tameem Adel
AU  - Geoff Gordon
AU  - Brandon Amos
BT  - Proceedings of The 33rd International Conference on Machine Learning
PY  - 2016/06/11
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-zhaoa16
PB  - PMLR
SP  - 1310
EP  - 1318
DP  - PMLR
L1  - http://proceedings.mlr.press/v48/zhaoa16.pdf
UR  - http://proceedings.mlr.press/v48/zhaoa16.html
AB  - Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.
ER  -
APA
Zhao, H., Adel, T., Gordon, G. & Amos, B. (2016). Collapsed Variational Inference for Sum-Product Networks. Proceedings of The 33rd International Conference on Machine Learning, in PMLR 48:1310-1318.