Heuristic Greedy Search Algorithms for Latent Variable Models

Peter Spirtes, Thomas S. Richardson, Christopher Meek
Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, PMLR R1:481-488, 1997.

Abstract

There has recently been significant progress in the development of algorithms for learning the directed acyclic graph (DAG) part of a Bayesian network without latent variables from data and optional background knowledge. However, the problem of learning the DAG part of a Bayesian network with latent (unmeasured) variables is much more difficult, for two reasons: first, the number of possible models is infinite, and second, calculating scores for latent variable models is generally much slower than calculating scores for models without latent variables. In this paper we describe how to extend search algorithms developed for non-latent-variable DAG models to DAG models with latent variables. We introduce two generalizations of DAGs, called mixed ancestor graphs (or MAGs) and partial ancestor graphs (or PAGs), and briefly describe how they can be used to search for latent variable DAG models, to classify, and to predict the effects of interventions in causal systems.
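
To make the abstract's notion of heuristic greedy search concrete, the following is a minimal, illustrative sketch of generic score-based hill climbing of the kind the abstract alludes to. It is not the authors' MAG/PAG algorithm: the neighbour-generation and scoring functions are placeholders that a real implementation would replace with MAG/PAG modifications and a latent variable model score (for example, a BIC-style score computed after fitting the model).

    # Generic greedy (hill-climbing) structure search, in Python.
    # "Model", "neighbours", and "score" are illustrative assumptions,
    # not the authors' definitions.
    from typing import Callable, Iterable, TypeVar

    Model = TypeVar("Model")

    def greedy_search(start: Model,
                      neighbours: Callable[[Model], Iterable[Model]],
                      score: Callable[[Model], float]) -> Model:
        """Climb from `start`, moving to the best-scoring neighbour
        until no neighbour improves the score (higher is better)."""
        current, current_score = start, score(start)
        while True:
            best, best_score = None, current_score
            for candidate in neighbours(current):
                s = score(candidate)
                if s > best_score:
                    best, best_score = candidate, s
            if best is None:          # local optimum reached
                return current
            current, current_score = best, best_score

    # Toy usage: "models" are integers, neighbours are +/-1 steps, and the
    # score peaks at 7, so the search climbs from 0 to the local optimum 7.
    if __name__ == "__main__":
        result = greedy_search(0,
                               neighbours=lambda m: (m - 1, m + 1),
                               score=lambda m: -(m - 7) ** 2)
        print(result)  # -> 7

In a latent variable setting, the candidate structures would be MAGs or PAGs, the neighbour moves would add, delete, or reorient edges, and scoring would be the expensive step the abstract highlights.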

Cite this Paper


BibTeX
@InProceedings{pmlr-vR1-spirtes97a,
  title     = {Heuristic Greedy Search Algorithms for Latent Variable Models},
  author    = {Spirtes, Peter and Richardson, Thomas S. and Meek, Christopher},
  booktitle = {Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics},
  pages     = {481--488},
  year      = {1997},
  editor    = {Madigan, David and Smyth, Padhraic},
  volume    = {R1},
  series    = {Proceedings of Machine Learning Research},
  month     = {04--07 Jan},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/r1/spirtes97a/spirtes97a.pdf},
  url       = {https://proceedings.mlr.press/r1/spirtes97a.html},
  note      = {Reissued by PMLR on 30 March 2021.}
}
Endnote
%0 Conference Paper
%T Heuristic Greedy Search Algorithms for Latent Variable Models
%A Peter Spirtes
%A Thomas S. Richardson
%A Christopher Meek
%B Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 1997
%E David Madigan
%E Padhraic Smyth
%F pmlr-vR1-spirtes97a
%I PMLR
%P 481--488
%U https://proceedings.mlr.press/r1/spirtes97a.html
%V R1
%Z Reissued by PMLR on 30 March 2021.
APA
Spirtes, P., Richardson, T. S., & Meek, C. (1997). Heuristic Greedy Search Algorithms for Latent Variable Models. Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R1:481-488. Available from https://proceedings.mlr.press/r1/spirtes97a.html. Reissued by PMLR on 30 March 2021.