An Efficient Low-Rank Tensors Representation for Algorithms in Complex Probabilistic Graphical Models

Gaspard Ducamp, Philippe Bonnard, Anthony Nouy, Pierre-Henri Wuillemin
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:173-184, 2020.

Abstract

Probabilistic Graphical Models form a class of compact representations of high-dimensional probability distributions, obtained by decomposing these distributions into a set of multivariate factors (potentials). Every exact algorithm (for probabilistic inference, MAP, etc.) operates on a specific representation of these potentials. However, complex probabilistic models often lead to very large potentials, which dramatically impact both the space and time complexities of these algorithms and can make inference intractable. In this paper we propose a new approach, based on low-rank tensor representations, to approximate and operate on these potentials. The low-rank tensor representation allows potentials to be approximated with controlled precision and a significant reduction in the number of parameters. Every operator used in such algorithms (multiplication, addition, projection, etc.) can be defined within this representation, leading to an approximation framework in which any algorithm for PGMs can easily be implemented. As an instance of this framework, we present a classical message passing algorithm for Bayesian networks using the tensor train format. By significantly reducing the computational complexity and memory usage, the proposed approach makes probabilistic inference much more scalable. These results are illustrated by experiments on dynamic Bayesian networks and classical Bayesian networks, performed with a Python implementation using TensorFlow, T3F and pyAgrum.
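
As a concrete illustration (a minimal NumPy sketch, not the paper's own TensorFlow/T3F implementation), the code below shows the two ingredients the abstract names: compressing a dense potential into tensor-train (TT) cores via sequential SVD, with a truncation threshold eps that controls the approximation error, and a projection (marginalization) carried out directly on the cores. Function names such as tt_decompose and tt_marginalize are illustrative, not from the paper.

import functools
import numpy as np

def tt_decompose(tensor, eps=1e-10):
    # Sequential SVD (TT-SVD) compression of a dense potential into TT cores.
    # Singular values below a threshold derived from eps are truncated,
    # which gives the controlled-precision, reduced-parameter approximation.
    shape = tensor.shape
    d = len(shape)
    delta = eps * np.linalg.norm(tensor) / max(np.sqrt(d - 1), 1.0)
    cores, rank = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = max(1, int(np.sum(s > delta)))
        cores.append(u[:, :r].reshape(rank, shape[k], r))
        mat = (s[:r, None] * vt[:r]).reshape(r * shape[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_to_dense(cores):
    # Contract the TT cores back into a dense array (for checking only).
    res = cores[0]
    for c in cores[1:]:
        res = np.tensordot(res, c, axes=(-1, 0))
    return res[0, ..., 0]

def tt_marginalize(cores, k):
    # Projection: sum variable k out of the potential directly in TT format.
    # Summing the k-th core over its mode index leaves a rank-by-rank matrix
    # that is absorbed into a neighbouring core; the TT ranks never increase.
    m = cores[k].sum(axis=1)
    out = list(cores[:k]) + list(cores[k + 1:])
    if k > 0:
        out[k - 1] = np.einsum('aib,bc->aic', out[k - 1], m)
    else:
        out[0] = np.einsum('ab,bic->aic', m, out[0])
    return out

# Toy check: a product of univariate factors is an exactly rank-1 potential.
rng = np.random.default_rng(0)
pot = functools.reduce(np.multiply.outer, [rng.random(2) for _ in range(6)])
cores = tt_decompose(pot, eps=1e-12)
print([c.shape for c in cores])                    # all TT ranks equal 1
print(np.allclose(tt_to_dense(cores), pot))        # True
print(np.allclose(tt_to_dense(tt_marginalize(cores, 2)), pot.sum(axis=2)))  # True

Note that in TT arithmetic the pointwise product of two tensors has, in general, the product of their ranks, so message passing in this format must interleave such operations with a rounding (re-truncation) step; this is where the precision/parameter-count trade-off described in the abstract comes into play.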

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-ducamp20b,
  title     = {An Efficient Low-Rank Tensors Representation for Algorithms in Complex Probabilistic Graphical Models},
  author    = {Ducamp, Gaspard and Bonnard, Philippe and Nouy, Anthony and Wuillemin, Pierre-Henri},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {173--184},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/ducamp20b/ducamp20b.pdf},
  url       = {https://proceedings.mlr.press/v138/ducamp20b.html}
}
Endnote
%0 Conference Paper
%T An Efficient Low-Rank Tensors Representation for Algorithms in Complex Probabilistic Graphical Models
%A Gaspard Ducamp
%A Philippe Bonnard
%A Anthony Nouy
%A Pierre-Henri Wuillemin
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-ducamp20b
%I PMLR
%P 173--184
%U https://proceedings.mlr.press/v138/ducamp20b.html
%V 138
APA
Ducamp, G., Bonnard, P., Nouy, A. & Wuillemin, P.-H. (2020). An Efficient Low-Rank Tensors Representation for Algorithms in Complex Probabilistic Graphical Models. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:173-184. Available from https://proceedings.mlr.press/v138/ducamp20b.html.
