Time–Approximation Trade-Offs for Learning Bayesian Networks

Madhumita Kundu, Pekka Parviainen, Saket Saurabh
Proceedings of The 12th International Conference on Probabilistic Graphical Models, PMLR 246:486-497, 2024.

Abstract

Bayesian network structure learning is an NP-hard problem. Furthermore, the problem remains hard even for various subclasses of graphs. Motivated by the hardness of exact learning, we study approximation algorithms for learning Bayesian networks. First, we propose a moderately exponential time algorithm with running time $\mathcal{O}(2^{\frac{\ell}{r}n})$ and approximation ratio $\frac{\ell}{r}$, where $n$ is the number of vertices and $\ell$ and $r$ are user-defined parameters with $\ell\leq r$. That is, we give time–approximation trade-offs for learning Bayesian networks. Second, we present a polynomial-time algorithm with approximation ratio $\frac{1}{d}$ for finding an optimal graph whose connected components have size at most $d$.
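
The first result admits a simple high-level reading. Assuming a decomposable score normalised so that the empty network scores zero and all local scores are non-negative, one can split the $n$ variables into $r$ balanced groups and, for each of the $\binom{r}{\ell}$ unions of $\ell$ groups, solve by exact subset dynamic programming the restricted problem in which only those variables may receive parents; the best of these networks is an $\frac{\ell}{r}$-approximation and each exact solve costs roughly $2^{\frac{\ell}{r}n}$ subset steps. The Python sketch below illustrates only this generic scheme under those assumptions; it is not taken from the paper, and the names (local_score, exact_dag, tradeoff_learn) and the bounded-size brute-force parent search are placeholders introduced for the illustration.

from itertools import combinations


def best_parents(v, candidates, local_score, max_parents):
    # Best-scoring parent set for v among subsets of `candidates` of size
    # at most `max_parents` (brute force, for illustration only).
    best_p, best_s = frozenset(), local_score(v, frozenset())
    for k in range(1, max_parents + 1):
        for p in combinations(sorted(candidates), k):
            s = local_score(v, frozenset(p))
            if s > best_s:
                best_p, best_s = frozenset(p), s
    return best_p, best_s


def exact_dag(allowed, others, local_score, max_parents):
    # Subset DP: only vertices in `allowed` may receive parents; a parent may
    # be any `allowed` vertex placed earlier or any vertex in `others`, which
    # remain sources.  Roughly 2^{|allowed|} * poly subset steps.
    allowed = sorted(allowed)
    best_score = {frozenset(): 0.0}
    best_choice = {}                      # subset -> (sink v, its parents, previous subset)
    for size in range(1, len(allowed) + 1):
        for subset in map(frozenset, combinations(allowed, size)):
            best_score[subset] = float("-inf")
            for v in subset:              # try v as the sink of this subset
                rest = subset - {v}
                candidates = set(rest) | set(others)
                p, s = best_parents(v, candidates, local_score, max_parents)
                total = best_score[rest] + s
                if total > best_score[subset]:
                    best_score[subset] = total
                    best_choice[subset] = (v, p, rest)
    # Reconstruct parent sets; vertices outside `allowed` keep empty parent sets.
    parents = {v: frozenset() for v in others}
    subset = frozenset(allowed)
    while subset:
        v, p, rest = best_choice[subset]
        parents[v] = p
        subset = rest
    total = best_score[frozenset(allowed)] + sum(local_score(v, frozenset()) for v in others)
    return total, parents


def tradeoff_learn(vertices, local_score, ell, r, max_parents=2):
    # Split the variables into r balanced groups; for each choice of ell
    # groups solve the restricted problem exactly and keep the best network.
    parts = [vertices[i::r] for i in range(r)]
    best = None
    for chosen in combinations(range(r), ell):
        allowed = {v for i in chosen for v in parts[i]}
        others = set(vertices) - allowed
        score, parents = exact_dag(allowed, others, local_score, max_parents)
        if best is None or score > best[0]:
            best = (score, parents)
    return best


# Hypothetical usage: my_local_score(v, parent_set) -> float, normalised so
# that my_local_score(v, frozenset()) == 0 and all values are non-negative.
# tradeoff_learn(list(range(8)), my_local_score, ell=1, r=2)

The approximation guarantee in this sketch comes from an averaging argument: deleting the parent sets of the non-chosen vertices from an optimal network leaves a feasible solution of the corresponding restricted problem, and each vertex belongs to an $\frac{\ell}{r}$ fraction of the $\binom{r}{\ell}$ group unions, so the best restricted solution scores at least $\frac{\ell}{r}$ times the optimum.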

Cite this Paper


BibTeX
@InProceedings{pmlr-v246-kundu24a,
  title     = {Time–Approximation Trade-Offs for Learning Bayesian Networks},
  author    = {Kundu, Madhumita and Parviainen, Pekka and Saurabh, Saket},
  booktitle = {Proceedings of The 12th International Conference on Probabilistic Graphical Models},
  pages     = {486--497},
  year      = {2024},
  editor    = {Kwisthout, Johan and Renooij, Silja},
  volume    = {246},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--13 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v246/main/assets/kundu24a/kundu24a.pdf},
  url       = {https://proceedings.mlr.press/v246/kundu24a.html},
  abstract  = {Bayesian network structure learning is an NP-hard problem. Furthermore, the problem remains hard even for various subclasses of graphs. Motivated by the hardness of exact learning, we study approximation algorithms for learning Bayesian networks. First, we propose a moderately exponential time algorithm with running time $\mathcal{O}(2^{\frac{\ell}{r}n})$ that has an approximation ratio $\frac{\ell}{r}$ where $n$ is the number of vertices and $\ell$ and $r$ are user-defined parameters with $\ell\leq r$. That is, we give time–approximation trade-offs for learning Bayesian networks. Second, we present a polynomial time algorithm with an approximation ratio $\frac{1}{d}$ to find an optimal graph whose connected components have size at most $d$.}
}
Endnote
%0 Conference Paper
%T Time–Approximation Trade-Offs for Learning Bayesian Networks
%A Madhumita Kundu
%A Pekka Parviainen
%A Saket Saurabh
%B Proceedings of The 12th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2024
%E Johan Kwisthout
%E Silja Renooij
%F pmlr-v246-kundu24a
%I PMLR
%P 486--497
%U https://proceedings.mlr.press/v246/kundu24a.html
%V 246
%X Bayesian network structure learning is an NP-hard problem. Furthermore, the problem remains hard even for various subclasses of graphs. Motivated by the hardness of exact learning, we study approximation algorithms for learning Bayesian networks. First, we propose a moderately exponential time algorithm with running time $\mathcal{O}(2^{\frac{\ell}{r}n})$ that has an approximation ratio $\frac{\ell}{r}$ where $n$ is the number of vertices and $\ell$ and $r$ are user-defined parameters with $\ell\leq r$. That is, we give time–approximation trade-offs for learning Bayesian networks. Second, we present a polynomial time algorithm with an approximation ratio $\frac{1}{d}$ to find an optimal graph whose connected components have size at most $d$.
APA
Kundu, M., Parviainen, P. &amp; Saurabh, S. (2024). Time–Approximation Trade-Offs for Learning Bayesian Networks. Proceedings of The 12th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 246:486-497. Available from https://proceedings.mlr.press/v246/kundu24a.html.