Multiple DAGs Learning with Non-negative Matrix Factorization

Yun Zhou, Jiang Wang, Cheng Zhu, Weiming Zhang
Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, PMLR 73:81-92, 2017.

Abstract

Probabilistic graphical models, e.g., Markov networks and Bayesian networks, have been well studied over the past two decades. However, it is still difficult to learn a reliable network structure, especially with limited data. Recent work has found that multi-task learning can improve the robustness of the learned networks by leveraging data from related tasks. In this paper, we focus on estimating the Directed Acyclic Graph (DAG) of a Bayesian network. Most existing multi-task or transfer learning algorithms for Bayesian networks use DAG relatedness as an inductive bias when optimizing multiple structures. More specifically, some works first find shared hidden structures among related tasks and then treat them as structure penalties in the learning step. However, current works omit the setting in which the shared hidden structure comes from different parts of different DAGs. Thus, in this paper, Non-negative Matrix Factorization (NMF) is employed to learn a parts-based representation that mitigates this problem. Theoretically, we show the plausibility of our approach. Empirically, we show that, compared to single-task learning, multi-task learning is better able to positively identify true edges on both synthetic data and real-world landmine data.
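
The abstract describes the core idea only at a high level: pool the edge evidence from several related tasks into one non-negative matrix, factorize it so that each row of the factor H captures a shared sub-structure (a "part"), and rebuild a per-task structure prior from those parts. The following is a minimal sketch of that idea, not the authors' implementation; the use of scikit-learn's NMF, bootstrap edge-confidence matrices as input, and the function name shared_structure_priors are all assumptions made for illustration.

# A minimal sketch (not the paper's algorithm): factorize the stacked
# edge-confidence matrices of several related DAG-learning tasks with NMF,
# so each task's structure prior is rebuilt from shared "parts".
import numpy as np
from sklearn.decomposition import NMF

def shared_structure_priors(edge_confidences, n_parts=3, seed=0):
    """edge_confidences: list of (d, d) non-negative arrays, one per task,
    e.g. bootstrap edge frequencies from single-task structure learning."""
    d = edge_confidences[0].shape[0]
    # Rows are tasks, columns are vectorized (parent, child) edge positions.
    X = np.stack([C.reshape(-1) for C in edge_confidences])   # (T, d*d)
    model = NMF(n_components=n_parts, init="nndsvda",
                max_iter=500, random_state=seed)
    W = model.fit_transform(X)    # (T, n_parts): task-specific part weights
    H = model.components_         # (n_parts, d*d): shared parts (sub-structures)
    # Each task's prior mixes the shared parts according to its own weights.
    priors = (W @ H).reshape(len(edge_confidences), d, d)
    return priors, W, H

# Toy usage: 4 related tasks over 5 variables.
rng = np.random.default_rng(0)
confidences = [rng.random((5, 5)) for _ in range(4)]
priors, W, H = shared_structure_priors(confidences, n_parts=2)
# priors[t] could then enter task t's structure score as a soft penalty,
# discouraging edges that are absent from the reconstructed shared structure.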

Cite this Paper


BibTeX
@InProceedings{pmlr-v73-zhou17a,
  title     = {Multiple DAGs Learning with Non-negative Matrix Factorization},
  author    = {Yun Zhou and Jiang Wang and Cheng Zhu and Weiming Zhang},
  booktitle = {Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks},
  pages     = {81--92},
  year      = {2017},
  editor    = {Antti Hyttinen and Joe Suzuki and Brandon Malone},
  volume    = {73},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v73/zhou17a/zhou17a.pdf},
  url       = {http://proceedings.mlr.press/v73/zhou17a.html},
  abstract  = {Probabilistic graphical models, e.g., Markov networks and Bayesian networks, have been well studied over the past two decades. However, it is still difficult to learn a reliable network structure, especially with limited data. Recent work has found that multi-task learning can improve the robustness of the learned networks by leveraging data from related tasks. In this paper, we focus on estimating the Directed Acyclic Graph (DAG) of a Bayesian network. Most existing multi-task or transfer learning algorithms for Bayesian networks use DAG relatedness as an inductive bias when optimizing multiple structures. More specifically, some works first find shared hidden structures among related tasks and then treat them as structure penalties in the learning step. However, current works omit the setting in which the shared hidden structure comes from different parts of different DAGs. Thus, in this paper, Non-negative Matrix Factorization (NMF) is employed to learn a parts-based representation that mitigates this problem. Theoretically, we show the plausibility of our approach. Empirically, we show that, compared to single-task learning, multi-task learning is better able to positively identify true edges on both synthetic data and real-world landmine data.}
}
Endnote
%0 Conference Paper
%T Multiple DAGs Learning with Non-negative Matrix Factorization
%A Yun Zhou
%A Jiang Wang
%A Cheng Zhu
%A Weiming Zhang
%B Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks
%C Proceedings of Machine Learning Research
%D 2017
%E Antti Hyttinen
%E Joe Suzuki
%E Brandon Malone
%F pmlr-v73-zhou17a
%I PMLR
%J Proceedings of Machine Learning Research
%P 81--92
%U http://proceedings.mlr.press
%V 73
%W PMLR
%X Probabilistic graphical models, e.g., Markov networks and Bayesian networks, have been well studied over the past two decades. However, it is still difficult to learn a reliable network structure, especially with limited data. Recent work has found that multi-task learning can improve the robustness of the learned networks by leveraging data from related tasks. In this paper, we focus on estimating the Directed Acyclic Graph (DAG) of a Bayesian network. Most existing multi-task or transfer learning algorithms for Bayesian networks use DAG relatedness as an inductive bias when optimizing multiple structures. More specifically, some works first find shared hidden structures among related tasks and then treat them as structure penalties in the learning step. However, current works omit the setting in which the shared hidden structure comes from different parts of different DAGs. Thus, in this paper, Non-negative Matrix Factorization (NMF) is employed to learn a parts-based representation that mitigates this problem. Theoretically, we show the plausibility of our approach. Empirically, we show that, compared to single-task learning, multi-task learning is better able to positively identify true edges on both synthetic data and real-world landmine data.
APA
Zhou, Y., Wang, J., Zhu, C. & Zhang, W. (2017). Multiple DAGs Learning with Non-negative Matrix Factorization. Proceedings of The 3rd International Workshop on Advanced Methodologies for Bayesian Networks, in PMLR 73:81-92.
