Discriminative Non-Parametric Learning of Arithmetic Circuits

Nandini Ramanan, Mayukh Das, Kristian Kersting, Sriraam Natarajan
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:353-364, 2020.

Abstract

Arithmetic Circuits (ACs) and Sum-Product Networks (SPNs) have recently gained significant interest by virtue of being tractable deep probabilistic models. We propose the first gradient-boosted method for structure learning of discriminative ACs (DACs), called DACBOOST. In discrete domains, ACs are essentially equivalent to mixtures of trees, so DACBOOST decomposes a large AC into smaller tree-structured ACs and learns them in a sequential, additive manner. This non-parametric approach to learning DACs yields a model with very few tuning parameters, making the learned model significantly more efficient. We demonstrate, on both standard and real-world data sets, the efficiency of DACBOOST compared to state-of-the-art DAC learners, without sacrificing effectiveness.
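The sequential, additive learning the abstract describes is the standard functional-gradient-boosting scheme. The toy sketch below is not the paper's algorithm; it only illustrates the boosting idea, using depth-1 regression stumps (hypothetical stand-ins for the small tree-structured ACs) fit to log-loss pseudo-residuals for a discriminative, binary-label task.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_stump(xs, grads):
    """Fit a depth-1 regression tree (stump) to pseudo-residuals on 1-D inputs."""
    best = None
    for t in sorted(set(xs)):
        left = [g for x, g in zip(xs, grads) if x <= t]
        right = [g for x, g in zip(xs, grads) if x > t]
        if not left or not right:
            continue
        lv, rv = sum(left) / len(left), sum(right) / len(right)
        err = sum((g - (lv if x <= t else rv)) ** 2 for x, g in zip(xs, grads))
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

def boost(xs, ys, rounds=20, lr=0.5):
    """Learn an additive model F(x) = sum of lr-scaled stumps, one per round."""
    trees = []
    F = [0.0] * len(xs)
    for _ in range(rounds):
        # Functional gradient of the logistic log-loss: y - sigmoid(F(x))
        grads = [y - sigmoid(f) for y, f in zip(ys, F)]
        tree = fit_stump(xs, grads)
        trees.append(tree)
        F = [f + lr * tree(x) for f, x in zip(F, xs)]
    return lambda x: sigmoid(sum(lr * t(x) for t in trees))

# Toy 1-D data: the label is 1 exactly when x > 3.
xs = [1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 1, 1, 1]
predict = boost(xs, ys)
print([round(predict(x)) for x in xs])  # recovers the labels: [0, 0, 0, 1, 1, 1]
```

Each round fits a small tree to the current gradient of the loss and adds it to the ensemble, which is the "sequential, additive manner" the abstract refers to; DACBOOST applies this scheme with tree-structured ACs as the weak learners instead of regression stumps.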

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-ramanan20a,
  title     = {Discriminative Non-Parametric Learning of Arithmetic Circuits},
  author    = {Ramanan, Nandini and Das, Mayukh and Kersting, Kristian and Natarajan, Sriraam},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {353--364},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/ramanan20a/ramanan20a.pdf},
  url       = {https://proceedings.mlr.press/v138/ramanan20a.html},
  abstract  = {Arithmetic Circuits (ACs) and Sum-Product Networks (SPNs) have recently gained significant interest by virtue of being tractable deep probabilistic models. We propose the first gradient-boosted method for structure learning of discriminative ACs (DACs), called DACBOOST. In discrete domains, ACs are essentially equivalent to mixtures of trees, so DACBOOST decomposes a large AC into smaller tree-structured ACs and learns them in a sequential, additive manner. This non-parametric approach to learning DACs yields a model with very few tuning parameters, making the learned model significantly more efficient. We demonstrate, on both standard and real-world data sets, the efficiency of DACBOOST compared to state-of-the-art DAC learners, without sacrificing effectiveness.}
}
Endnote
%0 Conference Paper
%T Discriminative Non-Parametric Learning of Arithmetic Circuits
%A Nandini Ramanan
%A Mayukh Das
%A Kristian Kersting
%A Sriraam Natarajan
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-ramanan20a
%I PMLR
%P 353--364
%U https://proceedings.mlr.press/v138/ramanan20a.html
%V 138
%X Arithmetic Circuits (ACs) and Sum-Product Networks (SPNs) have recently gained significant interest by virtue of being tractable deep probabilistic models. We propose the first gradient-boosted method for structure learning of discriminative ACs (DACs), called DACBOOST. In discrete domains, ACs are essentially equivalent to mixtures of trees, so DACBOOST decomposes a large AC into smaller tree-structured ACs and learns them in a sequential, additive manner. This non-parametric approach to learning DACs yields a model with very few tuning parameters, making the learned model significantly more efficient. We demonstrate, on both standard and real-world data sets, the efficiency of DACBOOST compared to state-of-the-art DAC learners, without sacrificing effectiveness.
APA
Ramanan, N., Das, M., Kersting, K., & Natarajan, S. (2020). Discriminative Non-Parametric Learning of Arithmetic Circuits. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:353-364. Available from https://proceedings.mlr.press/v138/ramanan20a.html.