Quantum Tensor Networks, Stochastic Processes, and Weighted Automata

Sandesh Adhikary, Siddarth Srinivasan, Jacob Miller, Guillaume Rabusseau, Byron Boots
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:2080-2088, 2021.

Abstract

Modeling joint probability distributions over sequences has been studied from many perspectives. The physics community developed matrix product states, a tensor-train decomposition for probabilistic modeling, motivated by the need to tractably model many-body systems. But similar models have also been studied in the stochastic processes and weighted automata literature, with little work on how these bodies of work relate to each other. We address this gap by showing how stationary or uniform versions of popular quantum tensor network models have equivalent representations in the stochastic processes and weighted automata literature, in the limit of infinitely long sequences. We demonstrate several equivalence results between models used in these three communities: (i) uniform variants of matrix product states, Born machines and locally purified states from the quantum tensor networks literature, (ii) predictive state representations, hidden Markov models, norm-observable operator models and hidden quantum Markov models from the stochastic process literature, and (iii) stochastic weighted automata, probabilistic automata and quadratic automata from the formal languages literature. Such connections may open the door for results and methods developed in one area to be applied in another.
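For intuition about the kind of model being related across these communities, here is a minimal numerical sketch (not taken from the paper; the names bond_dim, A, alpha, and omega are illustrative) of a uniform Born machine viewed as a quadratic weighted automaton: each symbol indexes a transition matrix, the amplitude of a sequence is the matrix product contracted with boundary vectors, and probabilities are squared amplitudes normalized over all sequences of the same length.

# Illustrative sketch only: a uniform Born machine / quadratic weighted
# automaton over a finite alphabet. Parameter names are assumptions made
# for this example, not the paper's notation.
import itertools
import numpy as np

rng = np.random.default_rng(0)
bond_dim, alphabet = 3, 2            # hidden (bond) dimension, alphabet size

# One transition matrix A[s] per symbol s, plus boundary vectors alpha, omega.
A = rng.normal(size=(alphabet, bond_dim, bond_dim))
alpha = rng.normal(size=bond_dim)
omega = rng.normal(size=bond_dim)

def amplitude(seq):
    """alpha^T A[x1] ... A[xT] omega, the amplitude assigned to seq."""
    v = alpha
    for s in seq:
        v = v @ A[s]
    return v @ omega

def probability(seq):
    """Squared amplitude, normalized over all sequences of the same length."""
    T = len(seq)
    Z = sum(amplitude(w) ** 2
            for w in itertools.product(range(alphabet), repeat=T))
    return amplitude(seq) ** 2 / Z

print(probability((0, 1, 1)))        # a valid probability in [0, 1]

Replacing the squared amplitude with a linear functional of nonnegative parameters gives the HMM/stochastic-weighted-automaton flavor of model that the abstract lists alongside Born machines; the paper formalizes correspondences of this kind in the uniform, infinite-sequence setting.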

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-adhikary21a,
  title     = {Quantum Tensor Networks, Stochastic Processes, and Weighted Automata},
  author    = {Adhikary, Sandesh and Srinivasan, Siddarth and Miller, Jacob and Rabusseau, Guillaume and Boots, Byron},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {2080--2088},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/adhikary21a/adhikary21a.pdf},
  url       = {https://proceedings.mlr.press/v130/adhikary21a.html}
}
APA
Adhikary, S., Srinivasan, S., Miller, J., Rabusseau, G. & Boots, B. (2021). Quantum Tensor Networks, Stochastic Processes, and Weighted Automata. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:2080-2088. Available from https://proceedings.mlr.press/v130/adhikary21a.html.
