Sparse Causal Discovery in Multivariate Time Series

Stefan Haufe, Klaus-Robert Müller, Guido Nolte, Nicole Krämer
Proceedings of Workshop on Causality: Objectives and Assessment at NIPS 2008, PMLR 6:97-106, 2010.

Abstract

Our goal is to estimate causal interactions in multivariate time series. Using vector autoregressive (VAR) models, these can be defined based on non-vanishing coefficients belonging to respective time-lagged instances. As in most cases a parsimonious causality structure is assumed, a promising approach to causal discovery consists in fitting VAR models with an additional sparsity-promoting regularization. Along this line we here propose that sparsity should be enforced for the subgroups of coefficients that belong to each pair of time series, as the absence of a causal relation requires the coefficients for all time-lags to become jointly zero. Such behavior can be achieved by means of $l_{1,2}$-norm regularized regression, for which an efficient active set solver has been proposed recently. Our method is shown to outperform standard methods in recovering simulated causality graphs. The results are on par with a second novel approach which uses multiple statistical testing.
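The paper's solver is an efficient active-set method for $l_{1,2}$-regularized regression. As a rough, self-contained illustration of the grouping idea only (not the authors' implementation), the sketch below fits a VAR(p) with a group-lasso penalty by plain proximal gradient descent, where each group collects the p lagged coefficients linking one source series to one target series. The function name, lag ordering, and hyperparameters are illustrative assumptions.

```python
import numpy as np

def var_group_lasso(X, p=2, lam=0.1, n_iter=500):
    """Fit a VAR(p) model with an l_{1,2} (group-lasso) penalty.

    The p lagged coefficients linking source series j to target series i
    form one group; a Granger-causal edge j -> i is declared when that
    group is jointly non-zero.  X has shape (T, d).
    """
    T, d = X.shape
    n = T - p
    Y = X[p:]                                                 # (n, d)
    # Design matrix: first d columns are lag-1 values, next d lag-2, ...
    Z = np.hstack([X[p - k:T - k] for k in range(1, p + 1)])  # (n, d*p)
    step = n / np.linalg.norm(Z, 2) ** 2    # 1 / Lipschitz constant of the gradient
    B = np.zeros((d * p, d))
    for _ in range(n_iter):
        # Gradient step on the mean squared residual
        B = B - step * (Z.T @ (Z @ B - Y)) / n
        # Proximal step: shrink each lag group toward zero as a block,
        # so coefficients for all time-lags of a pair vanish jointly
        for i in range(d):
            for j in range(d):
                idx = [k * d + j for k in range(p)]
                nrm = np.linalg.norm(B[idx, i])
                if nrm > 0:
                    B[idx, i] *= max(0.0, 1.0 - step * lam / nrm)
    # Edge j -> i iff the whole group of lagged coefficients survives
    A = np.array([[np.linalg.norm(B[[k * d + j for k in range(p)], i]) > 1e-8
                   for i in range(d)] for j in range(d)])
    return B, A
```

An ordinary lasso would zero individual lag coefficients independently, which can leave a spurious edge supported by a single lag; the block-wise shrinkage above removes or keeps all lags of a pair together, matching the paper's argument for the $l_{1,2}$ penalty.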

Cite this Paper


BibTeX
@InProceedings{pmlr-v6-haufe10a,
  title     = {Sparse Causal Discovery in Multivariate Time Series},
  author    = {Haufe, Stefan and Müller, Klaus-Robert and Nolte, Guido and Krämer, Nicole},
  booktitle = {Proceedings of Workshop on Causality: Objectives and Assessment at NIPS 2008},
  pages     = {97--106},
  year      = {2010},
  editor    = {Guyon, Isabelle and Janzing, Dominik and Schölkopf, Bernhard},
  volume    = {6},
  series    = {Proceedings of Machine Learning Research},
  address   = {Whistler, Canada},
  month     = {12 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v6/haufe10a/haufe10a.pdf},
  url       = {https://proceedings.mlr.press/v6/haufe10a.html},
  abstract  = {Our goal is to estimate causal interactions in multivariate time series. Using vector autoregressive (VAR) models, these can be defined based on non-vanishing coefficients belonging to respective time-lagged instances. As in most cases a parsimonious causality structure is assumed, a promising approach to causal discovery consists in fitting VAR models with an additional sparsity-promoting regularization. Along this line we here propose that sparsity should be enforced for the subgroups of coefficients that belong to each pair of time series, as the absence of a causal relation requires the coefficients for all time-lags to become jointly zero. Such behavior can be achieved by means of $l_{1,2}$-norm regularized regression, for which an efficient active set solver has been proposed recently. Our method is shown to outperform standard methods in recovering simulated causality graphs. The results are on par with a second novel approach which uses multiple statistical testing.}
}
Endnote
%0 Conference Paper
%T Sparse Causal Discovery in Multivariate Time Series
%A Stefan Haufe
%A Klaus-Robert Müller
%A Guido Nolte
%A Nicole Krämer
%B Proceedings of Workshop on Causality: Objectives and Assessment at NIPS 2008
%C Proceedings of Machine Learning Research
%D 2010
%E Isabelle Guyon
%E Dominik Janzing
%E Bernhard Schölkopf
%F pmlr-v6-haufe10a
%I PMLR
%P 97--106
%U https://proceedings.mlr.press/v6/haufe10a.html
%V 6
%X Our goal is to estimate causal interactions in multivariate time series. Using vector autoregressive (VAR) models, these can be defined based on non-vanishing coefficients belonging to respective time-lagged instances. As in most cases a parsimonious causality structure is assumed, a promising approach to causal discovery consists in fitting VAR models with an additional sparsity-promoting regularization. Along this line we here propose that sparsity should be enforced for the subgroups of coefficients that belong to each pair of time series, as the absence of a causal relation requires the coefficients for all time-lags to become jointly zero. Such behavior can be achieved by means of l_{1,2}-norm regularized regression, for which an efficient active set solver has been proposed recently. Our method is shown to outperform standard methods in recovering simulated causality graphs. The results are on par with a second novel approach which uses multiple statistical testing.
RIS
TY - CPAPER
TI - Sparse Causal Discovery in Multivariate Time Series
AU - Stefan Haufe
AU - Klaus-Robert Müller
AU - Guido Nolte
AU - Nicole Krämer
BT - Proceedings of Workshop on Causality: Objectives and Assessment at NIPS 2008
DA - 2010/02/18
ED - Isabelle Guyon
ED - Dominik Janzing
ED - Bernhard Schölkopf
ID - pmlr-v6-haufe10a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 6
SP - 97
EP - 106
L1 - http://proceedings.mlr.press/v6/haufe10a/haufe10a.pdf
UR - https://proceedings.mlr.press/v6/haufe10a.html
AB - Our goal is to estimate causal interactions in multivariate time series. Using vector autoregressive (VAR) models, these can be defined based on non-vanishing coefficients belonging to respective time-lagged instances. As in most cases a parsimonious causality structure is assumed, a promising approach to causal discovery consists in fitting VAR models with an additional sparsity-promoting regularization. Along this line we here propose that sparsity should be enforced for the subgroups of coefficients that belong to each pair of time series, as the absence of a causal relation requires the coefficients for all time-lags to become jointly zero. Such behavior can be achieved by means of l_{1,2}-norm regularized regression, for which an efficient active set solver has been proposed recently. Our method is shown to outperform standard methods in recovering simulated causality graphs. The results are on par with a second novel approach which uses multiple statistical testing.
ER -
APA
Haufe, S., Müller, K., Nolte, G. & Krämer, N. (2010). Sparse Causal Discovery in Multivariate Time Series. Proceedings of Workshop on Causality: Objectives and Assessment at NIPS 2008, in Proceedings of Machine Learning Research 6:97-106. Available from https://proceedings.mlr.press/v6/haufe10a.html.