Multiple DAGs Learning with Non-negative Matrix Factorization
Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, PMLR 73:81-92, 2017.
Probabilistic graphical models, e.g., Markov networks and Bayesian networks, have been well studied over the past two decades. However, it remains difficult to learn a reliable network structure, especially with limited data. Recent works have found that multi-task learning can improve the robustness of the learned networks by leveraging data from related tasks. In this paper, we focus on estimating the Directed Acyclic Graph (DAG) of a Bayesian network. Most existing multi-task or transfer learning algorithms for Bayesian networks use DAG relatedness as an inductive bias when optimizing multiple structures. More specifically, some works first find shared hidden structures among related tasks, and then treat them as structure penalties in the learning step. However, current works overlook the setting in which the shared hidden structure is assembled from different parts of different DAGs. Thus, in this paper, Non-negative Matrix Factorization (NMF) is employed to learn a parts-based representation that addresses this problem. Theoretically, we show the plausibility of our approach. Empirically, we show that, compared to single-task learning, multi-task learning is better able to positively identify true edges on synthetic data and real-world landmine data.
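To make the parts-based idea concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm): the flattened adjacency matrices of several related DAGs are stacked as columns of a non-negative matrix V, and standard multiplicative-update NMF (Lee and Seung) factorizes V ≈ WH, so that columns of W can be read as shared edge "parts" that different DAGs reuse with task-specific weights. All sizes, the synthetic parts, and the update loop below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 related tasks over 5 nodes, generated from
# 2 shared edge "parts" (sub-structures) that tasks mix with
# non-negative weights. This only illustrates the parts-based
# representation; it is not the paper's learning procedure.
n_nodes, n_tasks, n_parts = 5, 4, 2

# Two shared parts, each a small set of edges in a 5x5 adjacency matrix.
part_a = np.zeros((n_nodes, n_nodes)); part_a[0, 1] = part_a[1, 2] = 1.0
part_b = np.zeros((n_nodes, n_nodes)); part_b[2, 3] = part_b[3, 4] = 1.0
parts = np.stack([part_a.ravel(), part_b.ravel()], axis=1)  # shape (25, 2)

# Each task's (weighted) adjacency vector mixes the parts, plus noise.
weights = rng.uniform(0.5, 1.5, size=(n_parts, n_tasks))
V = parts @ weights + 0.01 * rng.random((n_nodes * n_nodes, n_tasks))

# Multiplicative-update NMF minimizing ||V - W H||_F^2, keeping
# both factors non-negative throughout.
W = rng.random((V.shape[0], n_parts)) + 0.1
H = rng.random((n_parts, n_tasks)) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

print("reconstruction error:", np.linalg.norm(V - W @ H))
```

Because V is (nearly) a non-negative rank-2 mixture by construction, the recovered columns of W concentrate on the two planted edge sets, which is the sense in which NMF exposes parts shared across different DAGs.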