Sparse Learning in Gaussian Chain Graphs for State Space Models

Lasse Petersen
Proceedings of the Ninth International Conference on Probabilistic Graphical Models, PMLR 72:332-343, 2018.

Abstract

The graphical lasso is a popular method for estimating the structure of undirected Gaussian graphical models from data by penalized maximum likelihood. This paper extends the idea of structure estimation of graphical models by penalized maximum likelihood to Gaussian chain graph models for state space models. First we show how the class of linear Gaussian state space models can be interpreted in the chain graph set-up under both the LWF and AMP Markov properties, and we demonstrate how sparsity of the chain graph structure relates to sparsity of the model parameters. Exploiting this relation we propose two different penalized maximum likelihood estimators for recovering the chain graph structure from data depending on the Markov interpretation at hand. We frame the penalized maximum likelihood problem in a missing data set-up and carry out estimation in each of the two cases using the EM algorithm. The common E-step is solved by smoothing, and we solve the two different M-steps by utilizing existing methods from high dimensional statistics and convex optimization.
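To make the estimation scheme concrete, the following is a minimal illustrative sketch in Python of one EM-style pass of the kind the abstract describes: a Kalman filter with RTS smoothing plays the role of the common E-step, and scikit-learn's graphical_lasso stands in for an l1-penalized M-step. The model dimensions, the penalty level alpha, the use of state-equation residuals, and the choice of scikit-learn as the solver are all assumptions made here for illustration; the paper's actual M-steps depend on the LWF or AMP parametrization and are not reproduced in this sketch.

# Illustrative sketch (not the paper's exact estimator): one EM-style pass for a
# linear Gaussian state space model
#   x_t = A x_{t-1} + w_t,   w_t ~ N(0, Q)
#   y_t = C x_t     + v_t,   v_t ~ N(0, R)
# E-step: Kalman filtering + RTS smoothing of the latent states.
# M-step: an l1-penalised precision estimate (graphical lasso) on the expected
#         state-noise covariance, mirroring the sparse chain graph structure.
import numpy as np
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(0)

# --- simulate a small model (dimensions chosen arbitrarily for illustration) ---
p, T = 4, 200                        # state dimension, number of time points
A = 0.6 * np.eye(p)                  # state transition matrix
C = np.eye(p)                        # observation matrix
Q = np.eye(p)                        # true state noise covariance
R = 0.5 * np.eye(p)                  # observation noise covariance
x = np.zeros((T, p)); y = np.zeros((T, p))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.multivariate_normal(np.zeros(p), Q)
    y[t] = C @ x[t] + rng.multivariate_normal(np.zeros(p), R)

def kalman_smoother(y, A, C, Q, R):
    """Kalman filter followed by a Rauch-Tung-Striebel backward pass."""
    T, p = y.shape
    xf = np.zeros((T, p)); Pf = np.zeros((T, p, p))      # filtered
    xp = np.zeros((T, p)); Pp = np.zeros((T, p, p))      # predicted
    x_prev, P_prev = np.zeros(p), np.eye(p)
    for t in range(T):
        # predict
        xp[t] = A @ x_prev
        Pp[t] = A @ P_prev @ A.T + Q
        # update with observation y_t
        S = C @ Pp[t] @ C.T + R
        K = Pp[t] @ C.T @ np.linalg.inv(S)
        xf[t] = xp[t] + K @ (y[t] - C @ xp[t])
        Pf[t] = Pp[t] - K @ C @ Pp[t]
        x_prev, P_prev = xf[t], Pf[t]
    # RTS smoothing (backward pass)
    xs, Ps = xf.copy(), Pf.copy()
    for t in range(T - 2, -1, -1):
        J = Pf[t] @ A.T @ np.linalg.inv(Pp[t + 1])
        xs[t] = xf[t] + J @ (xs[t + 1] - xp[t + 1])
        Ps[t] = Pf[t] + J @ (Ps[t + 1] - Pp[t + 1]) @ J.T
    return xs, Ps

# --- E-step: smoothed states stand in for the unobserved chain components ---
xs, Ps = kalman_smoother(y, A, C, Q, R)

# --- M-step (sketch): expected residual covariance of the state equation, then
#     a sparse precision estimate via the graphical lasso. A full EM step would
#     also use the smoothed lag-one cross-covariances; omitted here for brevity.
resid = xs[1:] - xs[:-1] @ A.T
emp_cov = resid.T @ resid / resid.shape[0] + Ps[1:].mean(axis=0)
cov_hat, prec_hat = graphical_lasso(emp_cov, alpha=0.1)
print("estimated sparse precision of the state noise:\n", np.round(prec_hat, 2))

Zeros in prec_hat correspond to absent edges among the state variables; in the paper's chain graph formulation, the two Markov interpretations (LWF and AMP) lead to different penalized parametrizations of this M-step.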

Cite this Paper


BibTeX
@InProceedings{pmlr-v72-petersen18a,
  title     = {Sparse Learning in Gaussian Chain Graphs for State Space Models},
  author    = {Petersen, Lasse},
  booktitle = {Proceedings of the Ninth International Conference on Probabilistic Graphical Models},
  pages     = {332--343},
  year      = {2018},
  editor    = {Kratochvíl, Václav and Studený, Milan},
  volume    = {72},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v72/petersen18a/petersen18a.pdf},
  url       = {https://proceedings.mlr.press/v72/petersen18a.html}
}
Endnote
%0 Conference Paper
%T Sparse Learning in Gaussian Chain Graphs for State Space Models
%A Lasse Petersen
%B Proceedings of the Ninth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2018
%E Václav Kratochvíl
%E Milan Studený
%F pmlr-v72-petersen18a
%I PMLR
%P 332--343
%U https://proceedings.mlr.press/v72/petersen18a.html
%V 72
APA
Petersen, L. (2018). Sparse Learning in Gaussian Chain Graphs for State Space Models. Proceedings of the Ninth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 72:332-343. Available from https://proceedings.mlr.press/v72/petersen18a.html.
