Learning Linear Bayesian Networks with Latent Variables
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):249-257, 2013.
Abstract
This work considers the problem of learning linear Bayesian networks when some of the variables are unobserved. Identifiability and efficient recovery from low-order observable moments are established under a novel graphical constraint. The constraint concerns the expansion properties of the underlying directed acyclic graph (DAG) between observed and unobserved variables in the network, and it is satisfied by many natural families of DAGs, including multi-level DAGs, DAGs with effective depth one, and certain families of polytrees.
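To make the model class concrete, the following is a minimal illustrative sketch (not taken from the paper): a toy linear Bayesian network in which observed variables are linear functions of unobserved latent parents plus independent noise, together with the observable second-moment matrix of the kind from which recovery is studied. All dimensions, distributions, and the sparsity pattern below are hypothetical choices for illustration only.

```python
# Illustrative sketch, not the paper's method: simulate a linear latent-variable
# Bayesian network and form a low-order observable moment.
import numpy as np

rng = np.random.default_rng(0)

n_latent, n_observed, n_samples = 3, 8, 100_000

# Latent variables: independent sources (hypothetical non-Gaussian choice).
H = rng.laplace(size=(n_samples, n_latent))

# Mixing matrix: each observed variable depends linearly on a sparse subset
# of latent parents (sparsity pattern chosen arbitrarily here).
B = rng.normal(size=(n_latent, n_observed)) * (rng.random((n_latent, n_observed)) < 0.4)

# Observed variables = linear function of latent parents + independent noise.
X = H @ B + 0.1 * rng.normal(size=(n_samples, n_observed))

# A low-order observable moment: the empirical second-moment matrix of the
# observed variables, computed without access to the latent variables H.
second_moment = X.T @ X / n_samples
print(second_moment.shape)  # (8, 8)
```

The paper's contribution is to show when the latent structure and coefficients of such a model are identifiable from observable moments like this one, under expansion conditions on the bipartite structure between latent and observed nodes; the sketch above only simulates the data-generating side.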