Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures

Xiaoting Shao, Alejandro Molina, Antonio Vergari, Karl Stelzner, Robert Peharz, Thomas Liebig, Kristian Kersting
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:401-412, 2020.

Abstract

Probabilistic graphical models are a central tool in AI; however, they are generally not as expressive as deep neural models, and inference is notoriously hard and slow. In contrast, deep probabilistic models such as sum-product networks (SPNs) capture joint distributions in a tractable fashion, but still lack the expressive power of intractable models based on deep neural networks. Therefore, we introduce conditional SPNs (CSPNs), conditional density estimators for multivariate and potentially hybrid domains that allow harnessing the expressive power of neural networks while still maintaining tractability guarantees. One way to implement CSPNs is to use an existing SPN structure and condition its parameters on the input, e.g., via a deep neural network. Our experimental evidence demonstrates that CSPNs are competitive with other probabilistic models and yield superior performance on multilabel image classification compared to mean field and mixture density networks. Furthermore, they can successfully be employed as building blocks for structured probabilistic models, such as autoregressive image models.

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-shao20a,
  title     = {Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures},
  author    = {Shao, Xiaoting and Molina, Alejandro and Vergari, Antonio and Stelzner, Karl and Peharz, Robert and Liebig, Thomas and Kersting, Kristian},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {401--412},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/shao20a/shao20a.pdf},
  url       = {http://proceedings.mlr.press/v138/shao20a.html},
  abstract  = {Probabilistic graphical models are a central tool in AI; however, they are generally not as expressive as deep neural models, and inference is notoriously hard and slow. In contrast, deep probabilistic models such as sum-product networks (SPNs) capture joint distributions in a tractable fashion, but still lack the expressive power of intractable models based on deep neural networks. Therefore, we introduce conditional SPNs (CSPNs), conditional density estimators for multivariate and potentially hybrid domains that allow harnessing the expressive power of neural networks while still maintaining tractability guarantees. One way to implement CSPNs is to use an existing SPN structure and condition its parameters on the input, e.g., via a deep neural network. Our experimental evidence demonstrates that CSPNs are competitive with other probabilistic models and yield superior performance on multilabel image classification compared to mean field and mixture density networks. Furthermore, they can successfully be employed as building blocks for structured probabilistic models, such as autoregressive image models.}
}
EndNote
%0 Conference Paper
%T Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures
%A Xiaoting Shao
%A Alejandro Molina
%A Antonio Vergari
%A Karl Stelzner
%A Robert Peharz
%A Thomas Liebig
%A Kristian Kersting
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-shao20a
%I PMLR
%P 401--412
%U http://proceedings.mlr.press/v138/shao20a.html
%V 138
%X Probabilistic graphical models are a central tool in AI; however, they are generally not as expressive as deep neural models, and inference is notoriously hard and slow. In contrast, deep probabilistic models such as sum-product networks (SPNs) capture joint distributions in a tractable fashion, but still lack the expressive power of intractable models based on deep neural networks. Therefore, we introduce conditional SPNs (CSPNs), conditional density estimators for multivariate and potentially hybrid domains that allow harnessing the expressive power of neural networks while still maintaining tractability guarantees. One way to implement CSPNs is to use an existing SPN structure and condition its parameters on the input, e.g., via a deep neural network. Our experimental evidence demonstrates that CSPNs are competitive with other probabilistic models and yield superior performance on multilabel image classification compared to mean field and mixture density networks. Furthermore, they can successfully be employed as building blocks for structured probabilistic models, such as autoregressive image models.
APA
Shao, X., Molina, A., Vergari, A., Stelzner, K., Peharz, R., Liebig, T. &amp; Kersting, K. (2020). Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:401-412. Available from http://proceedings.mlr.press/v138/shao20a.html.