Causal Structure Learning via Temporal Markov Networks

Aubrey Barnard, David Page
Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:13-24, 2018.

Abstract

Learning the structure of a dynamic Bayesian network (DBN) is a common way of discovering causal relationships in time series data. However, the combinatorial nature of DBN structure learning limits the accuracy and scalability of DBN modeling. We propose to avoid these limits by learning structure with log-linear temporal Markov networks (TMNs). Using TMNs replaces the combinatorial optimization problem with a continuous, convex one, which can be solved quickly with gradient methods. Furthermore, representing the data in terms of features gives TMNs an advantage in modeling the dynamics of sequences with irregular, sparse, or noisy events. Compared to representative DBN structure learners, TMNs run faster while performing as accurately on synthetic tasks and a real-world task of causal discovery in electronic medical records.
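
To make the idea concrete, below is a minimal, illustrative sketch, not the paper's TMN implementation: it approximates a log-linear temporal model with one L1-regularized logistic regression per target variable over lagged event features, fit by a proximal gradient method, and reads nonzero weights as candidate temporal dependencies. The function names, the lag-1 feature construction, and the synthetic event stream are all assumptions made for illustration.

# Illustrative sketch only (assumed details, not the authors' TMN code):
# approximate a log-linear temporal Markov network with one L1-regularized
# logistic regression per target variable, conditioned on lagged event
# features, and treat nonzero weights as candidate temporal dependencies.
import numpy as np


def make_lagged_features(events, lag=1):
    """Build lagged features X and targets Y from a (T, V) binary event matrix."""
    T, V = events.shape
    # X[t] = events at the preceding `lag` time steps, flattened; Y[t] = events at t.
    X = np.hstack([events[lag - k:T - k] for k in range(1, lag + 1)])
    Y = events[lag:]
    return X, Y


def fit_l1_logistic(X, y, lam=0.05, lr=0.1, iters=2000):
    """L1-regularized logistic regression fit by proximal gradient descent (ISTA)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted event probabilities
        w -= lr * (X.T @ (p - y) / n)              # gradient step on the average log-loss
        b -= lr * float(np.mean(p - y))
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft-threshold (L1 prox)
    return w, b


if __name__ == "__main__":
    # Tiny synthetic stream: variable 0 fires at random and tends to trigger
    # variable 1 one step later; variables 2 and 3 are pure noise.
    rng = np.random.default_rng(0)
    T, V = 2000, 4
    events = np.zeros((T, V))
    events[:, 0] = rng.random(T) < 0.3
    events[1:, 1] = (events[:-1, 0] > 0) & (rng.random(T - 1) < 0.8)
    events[:, 2] = rng.random(T) < 0.2
    events[:, 3] = rng.random(T) < 0.2

    X, Y = make_lagged_features(events, lag=1)
    for j in range(V):
        w, _ = fit_l1_logistic(X, Y[:, j])
        parents = np.flatnonzero(np.abs(w) > 1e-3)
        print(f"variable {j}: candidate lagged parents {parents.tolist()}")

On this toy stream the fit should recover lagged variable 0 as the lone parent of variable 1 and leave the noise variables unconnected. The key point carried over from the paper is that the whole problem is a continuous, convex optimization solved with gradient steps, rather than a combinatorial search over parent sets; the actual TMN additionally uses richer temporal features to handle irregular, sparse, or noisy event sequences.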

Cite this Paper


BibTeX
@InProceedings{pmlr-v72-barnard18a,
  title     = {Causal Structure Learning via Temporal Markov Networks},
  author    = {Barnard, Aubrey and Page, David},
  booktitle = {Proceedings of the Ninth International Conference on Probabilistic Graphical Models},
  pages     = {13--24},
  year      = {2018},
  editor    = {Kratochvíl, Václav and Studený, Milan},
  volume    = {72},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v72/barnard18a/barnard18a.pdf},
  url       = {https://proceedings.mlr.press/v72/barnard18a.html}
}
Endnote
%0 Conference Paper
%T Causal Structure Learning via Temporal Markov Networks
%A Aubrey Barnard
%A David Page
%B Proceedings of the Ninth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2018
%E Václav Kratochvíl
%E Milan Studený
%F pmlr-v72-barnard18a
%I PMLR
%P 13--24
%U https://proceedings.mlr.press/v72/barnard18a.html
%V 72
APA
Barnard, A. & Page, D. (2018). Causal Structure Learning via Temporal Markov Networks. Proceedings of the Ninth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 72:13-24. Available from https://proceedings.mlr.press/v72/barnard18a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v72/barnard18a/barnard18a.pdf