Collapsed Variational Inference for Sum-Product Networks

Han Zhao, Tameem Adel, Geoff Gordon, Brandon Amos
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1310-1318, 2016.

Abstract

Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in time linear in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are more prone to overfitting than Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable, and even approximation techniques such as standard variational inference and posterior sampling are computationally infeasible for networks of moderate size, due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement, and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum-likelihood-based approach.
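To make the linear-time claim concrete, here is a minimal sketch (not from the paper) of a tiny SPN over two binary variables, evaluated bottom-up in Python. Every node is visited exactly once, so a single evaluation costs time proportional to the number of edges in the network. The structure, weights, and leaf probabilities below are illustrative, not taken from the authors' experiments.

```python
from dataclasses import dataclass

@dataclass
class Leaf:
    var: int           # index of the binary variable this leaf models
    prob_true: float   # P(X_var = 1) under this leaf's distribution
    def value(self, x):
        return self.prob_true if x[self.var] else 1.0 - self.prob_true

@dataclass
class Product:
    children: list     # child nodes over disjoint variable scopes
    def value(self, x):
        v = 1.0
        for c in self.children:
            v *= c.value(x)
        return v

@dataclass
class Sum:
    children: list     # list of (weight, node); weights sum to 1
    def value(self, x):
        return sum(w * c.value(x) for w, c in self.children)

# A toy SPN over X0, X1: a two-component mixture of fully
# factored distributions (hypothetical example structure).
spn = Sum([
    (0.6, Product([Leaf(0, 0.9), Leaf(1, 0.2)])),
    (0.4, Product([Leaf(0, 0.1), Leaf(1, 0.7)])),
])

p = spn.value((1, 0))  # joint probability P(X0=1, X1=0)
```

A single bottom-up pass like this suffices for exact marginal and conditional queries as well, by replacing leaf values for marginalized variables with 1; the Bayesian learning problem the paper addresses concerns the posterior over the sum-node weights, not this evaluation step.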

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-zhaoa16,
  title     = {Collapsed Variational Inference for Sum-Product Networks},
  author    = {Zhao, Han and Adel, Tameem and Gordon, Geoff and Amos, Brandon},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1310--1318},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/zhaoa16.pdf},
  url       = {https://proceedings.mlr.press/v48/zhaoa16.html},
  abstract  = {Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.}
}
Endnote
%0 Conference Paper
%T Collapsed Variational Inference for Sum-Product Networks
%A Han Zhao
%A Tameem Adel
%A Geoff Gordon
%A Brandon Amos
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-zhaoa16
%I PMLR
%P 1310--1318
%U https://proceedings.mlr.press/v48/zhaoa16.html
%V 48
%X Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.
RIS
TY  - CPAPER
TI  - Collapsed Variational Inference for Sum-Product Networks
AU  - Han Zhao
AU  - Tameem Adel
AU  - Geoff Gordon
AU  - Brandon Amos
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-zhaoa16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1310
EP  - 1318
L1  - http://proceedings.mlr.press/v48/zhaoa16.pdf
UR  - https://proceedings.mlr.press/v48/zhaoa16.html
AB  - Sum-Product Networks (SPNs) are probabilistic inference machines that admit exact inference in linear time in the size of the network. Existing parameter learning approaches for SPNs are largely based on the maximum likelihood principle and are subject to overfitting compared to more Bayesian approaches. Exact Bayesian posterior inference for SPNs is computationally intractable. Even approximation techniques such as standard variational inference and posterior sampling for SPNs are computationally infeasible even for networks of moderate size due to the large number of local latent variables per instance. In this work, we propose a novel deterministic collapsed variational inference algorithm for SPNs that is computationally efficient, easy to implement and at the same time allows us to incorporate prior information into the optimization formulation. Extensive experiments show a significant improvement in accuracy compared with a maximum likelihood based approach.
ER  -
APA
Zhao, H., Adel, T., Gordon, G., & Amos, B. (2016). Collapsed Variational Inference for Sum-Product Networks. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1310-1318. Available from https://proceedings.mlr.press/v48/zhaoa16.html.
