Residual Sum-Product Networks

Fabrizio Ventola, Karl Stelzner, Alejandro Molina, Kristian Kersting
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:545-556, 2020.

Abstract

Tractable yet expressive density estimators are a key building block of probabilistic machine learning. While sum-product networks (SPNs) offer attractive inference capabilities, obtaining structures large enough to fit complex, high-dimensional data has proven challenging. In this paper, we present a residual learning approach to ease the learning of SPNs, which are deeper and wider than those used previously. The main trick is to ensemble SPNs by explicitly reformulating sum nodes as residual functions. This adds references to substructures across the SPNs at different depths, which in turn helps to improve training. Our experiments demonstrate that the resulting residual SPNs (ResSPNs) are easy to optimize, gain performance from considerably increased depth and width, and are competitive with state-of-the-art SPN structure learning approaches. To combat overfitting, we introduce an iterative pruning technique that compacts models and yields better generalization.
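
To make the abstract's "main trick" concrete, here is a minimal, hypothetical sketch of a sum node reformulated as a residual mixture: a new substructure is attached to an existing (sub-)SPN through a mixing weight eps, so that with a small eps the combined model starts out close to the base network and the new branch only needs to learn a correction. The class names, Gaussian leaves, and the scalar eps parameterization are illustrative assumptions, not the authors' actual implementation; see the PDF linked below for that.

```python
import numpy as np
from scipy.stats import norm


class Leaf:
    """Univariate Gaussian leaf over one input column (`scope`)."""
    def __init__(self, scope, mean=0.0, std=1.0):
        self.scope, self.mean, self.std = scope, mean, std

    def log_prob(self, x):
        return norm.logpdf(x[:, self.scope], loc=self.mean, scale=self.std)


class Product:
    """Product node: children cover disjoint scopes, so log-densities add."""
    def __init__(self, children):
        self.children = children

    def log_prob(self, x):
        return sum(c.log_prob(x) for c in self.children)


class ResidualSum:
    """Sum node written as a residual mixture:
        p(x) = (1 - eps) * base(x) + eps * residual(x)
    With a small initial eps, the model stays close to the existing `base`
    (sub-)SPN, and the `residual` substructure learns a correction.
    (A single scalar eps is an assumption made for this sketch.)"""
    def __init__(self, base, residual, eps=0.1):
        self.base, self.residual, self.eps = base, residual, eps

    def log_prob(self, x):
        lp = np.stack([self.base.log_prob(x), self.residual.log_prob(x)])
        logw = np.log(np.array([1.0 - self.eps, self.eps]))[:, None]
        m = (lp + logw).max(axis=0)  # log-sum-exp for numerical stability
        return m + np.log(np.exp(lp + logw - m).sum(axis=0))


# Toy usage: refine a shallow base SPN with a residual substructure.
base = Product([Leaf(0, 0.0, 1.0), Leaf(1, 0.0, 1.0)])
residual = Product([Leaf(0, 2.0, 0.5), Leaf(1, -2.0, 0.5)])
model = ResidualSum(base, residual, eps=0.2)
print(model.log_prob(np.random.randn(5, 2)))
```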

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-ventola20a,
  title     = {Residual Sum-Product Networks},
  author    = {Ventola, Fabrizio and Stelzner, Karl and Molina, Alejandro and Kersting, Kristian},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {545--556},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/ventola20a/ventola20a.pdf},
  url       = {https://proceedings.mlr.press/v138/ventola20a.html},
  abstract  = {Tractable yet expressive density estimators are a key building block of probabilistic machine learning. While sum-product networks (SPNs) offer attractive inference capabilities, obtaining structures large enough to fit complex, high-dimensional data has proven challenging. In this paper, we present a residual learning approach to ease the learning of SPNs, which are deeper and wider than those used previously. The main trick is to ensemble SPNs by explicitly reformulating sum nodes as residual functions. This adds references to substructures across the SPNs at different depths, which in turn helps to improve training. Our experiments demonstrate that the resulting residual SPNs (ResSPNs) are easy to optimize, gain performance from considerably increased depth and width, and are competitive with state-of-the-art SPN structure learning approaches. To combat overfitting, we introduce an iterative pruning technique that compacts models and yields better generalization.}
}
Endnote
%0 Conference Paper
%T Residual Sum-Product Networks
%A Fabrizio Ventola
%A Karl Stelzner
%A Alejandro Molina
%A Kristian Kersting
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-ventola20a
%I PMLR
%P 545--556
%U https://proceedings.mlr.press/v138/ventola20a.html
%V 138
%X Tractable yet expressive density estimators are a key building block of probabilistic machine learning. While sum-product networks (SPNs) offer attractive inference capabilities, obtaining structures large enough to fit complex, high-dimensional data has proven challenging. In this paper, we present a residual learning approach to ease the learning of SPNs, which are deeper and wider than those used previously. The main trick is to ensemble SPNs by explicitly reformulating sum nodes as residual functions. This adds references to substructures across the SPNs at different depths, which in turn helps to improve training. Our experiments demonstrate that the resulting residual SPNs (ResSPNs) are easy to optimize, gain performance from considerably increased depth and width, and are competitive with state-of-the-art SPN structure learning approaches. To combat overfitting, we introduce an iterative pruning technique that compacts models and yields better generalization.
APA
Ventola, F., Stelzner, K., Molina, A. & Kersting, K. (2020). Residual Sum-Product Networks. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:545-556. Available from https://proceedings.mlr.press/v138/ventola20a.html.
