On Recovering from Modeling Errors Using Testing Bayesian Networks

Haiying Huang, Adnan Darwiche
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:4402-4411, 2021.

Abstract

We consider the problem of supervised learning with Bayesian Networks when the used dependency structure is incomplete due to missing edges or missing variable states. These modeling errors induce independence constraints on the learned model that may not hold in the true, data-generating distribution. We provide a unified treatment of these modeling errors as instances of state-space abstractions. We then identify a class of Bayesian Networks and queries which allow one to fully recover from such modeling errors if one can choose Conditional Probability Tables (CPTs) dynamically based on evidence. We show theoretically that the recently proposed Testing Bayesian Networks (TBNs), which can be trained by compiling them into Testing Arithmetic Circuits (TACs), provide a promising construct for emulating this CPT selection mechanism. Finally, we present empirical results that illustrate the promise of TBNs as a tool for recovering from certain modeling errors in the context of supervised learning.

Cite this Paper

BibTeX
@InProceedings{pmlr-v139-huang21a,
  title     = {On Recovering from Modeling Errors Using Testing Bayesian Networks},
  author    = {Huang, Haiying and Darwiche, Adnan},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {4402--4411},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/huang21a/huang21a.pdf},
  url       = {https://proceedings.mlr.press/v139/huang21a.html},
  abstract  = {We consider the problem of supervised learning with Bayesian Networks when the used dependency structure is incomplete due to missing edges or missing variable states. These modeling errors induce independence constraints on the learned model that may not hold in the true, data-generating distribution. We provide a unified treatment of these modeling errors as instances of state-space abstractions. We then identify a class of Bayesian Networks and queries which allow one to fully recover from such modeling errors if one can choose Conditional Probability Tables (CPTs) dynamically based on evidence. We show theoretically that the recently proposed Testing Bayesian Networks (TBNs), which can be trained by compiling them into Testing Arithmetic Circuits (TACs), provide a promising construct for emulating this CPT selection mechanism. Finally, we present empirical results that illustrate the promise of TBNs as a tool for recovering from certain modeling errors in the context of supervised learning.}
}
Endnote
%0 Conference Paper
%T On Recovering from Modeling Errors Using Testing Bayesian Networks
%A Haiying Huang
%A Adnan Darwiche
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-huang21a
%I PMLR
%P 4402--4411
%U https://proceedings.mlr.press/v139/huang21a.html
%V 139
%X We consider the problem of supervised learning with Bayesian Networks when the used dependency structure is incomplete due to missing edges or missing variable states. These modeling errors induce independence constraints on the learned model that may not hold in the true, data-generating distribution. We provide a unified treatment of these modeling errors as instances of state-space abstractions. We then identify a class of Bayesian Networks and queries which allow one to fully recover from such modeling errors if one can choose Conditional Probability Tables (CPTs) dynamically based on evidence. We show theoretically that the recently proposed Testing Bayesian Networks (TBNs), which can be trained by compiling them into Testing Arithmetic Circuits (TACs), provide a promising construct for emulating this CPT selection mechanism. Finally, we present empirical results that illustrate the promise of TBNs as a tool for recovering from certain modeling errors in the context of supervised learning.
APA
Huang, H. & Darwiche, A. (2021). On Recovering from Modeling Errors Using Testing Bayesian Networks. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:4402-4411. Available from https://proceedings.mlr.press/v139/huang21a.html.