Supervised Learning with Background Knowledge

Yizuo Chen, Arthur Choi, Adnan Darwiche
Proceedings of the 10th International Conference on Probabilistic Graphical Models, PMLR 138:89-100, 2020.

Abstract

We consider the task of supervised learning while focusing on the impact that background knowledge may have on the accuracy and robustness of learned classifiers. We consider three types of background knowledge: causal domain knowledge, functional dependencies, and logical constraints. Our findings are set in the context of an empirical study that compares two classes of classifiers: Arithmetic Circuit (AC) classifiers compiled from Bayesian network models with varying degrees of background knowledge, and Convolutional Neural Network (CNN) classifiers. We report on the accuracy and robustness of such classifiers on two tasks concerned with recognizing synthesized shapes in noisy images. We show that classifiers that encode background knowledge need much less data to attain a given accuracy, and are more robust both to the level of noise in the data and to mismatches between the noise patterns in the training and testing data.
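For concreteness, the following is a minimal sketch (not from the paper) of one of the three ideas: encoding a logical constraint as background knowledge in a tiny Bayesian network classifier. All variable names, constraints, and probabilities below are hypothetical, and inference is done by brute-force enumeration for clarity, whereas the paper compiles such networks into Arithmetic Circuits for efficient inference.

    from itertools import product

    # Hypothetical class prior P(C); C = 1 means "shape present".
    prior = {0: 0.5, 1: 0.5}

    # Hypothetical per-pixel noise model P(X_i = 1 | C = c),
    # shared across pixels; the numbers are made up.
    p_on = {0: 0.2, 1: 0.8}

    def constraint(x):
        # Illustrative logical background knowledge: rule out
        # impossible pixel configurations by forbidding any
        # adjacent pair (1, 0). Real constraints would encode
        # shape geometry.
        return not any(a == 1 and b == 0 for a, b in zip(x, x[1:]))

    def posterior(evidence, n=4):
        # P(C | observed pixels): marginalize unobserved pixels,
        # assigning probability zero to constraint-violating worlds.
        score = {0: 0.0, 1: 0.0}
        for c in (0, 1):
            for x in product((0, 1), repeat=n):
                if not constraint(x):
                    continue  # eliminated by background knowledge
                if any(x[i] != v for i, v in evidence.items()):
                    continue  # inconsistent with the observations
                like = 1.0
                for xi in x:
                    like *= p_on[c] if xi == 1 else 1.0 - p_on[c]
                score[c] += prior[c] * like
        z = score[0] + score[1]
        return {c: s / z for c, s in score.items()}

    # Observe the first two pixels "on"; the constraint then forces
    # the remaining pixels, sharpening the posterior on C = 1.
    print(posterior({0: 1, 1: 1}))

The point of the sketch is that the constraint prunes the space of worlds the classifier must consider, which is one mechanism by which background knowledge can reduce the amount of data needed to reach a given accuracy.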

Cite this Paper


BibTeX
@InProceedings{pmlr-v138-chen20c,
  title     = {Supervised Learning with Background Knowledge},
  author    = {Chen, Yizuo and Choi, Arthur and Darwiche, Adnan},
  booktitle = {Proceedings of the 10th International Conference on Probabilistic Graphical Models},
  pages     = {89--100},
  year      = {2020},
  editor    = {Jaeger, Manfred and Nielsen, Thomas Dyhre},
  volume    = {138},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--25 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v138/chen20c/chen20c.pdf},
  url       = {https://proceedings.mlr.press/v138/chen20c.html}
}
Endnote
%0 Conference Paper
%T Supervised Learning with Background Knowledge
%A Yizuo Chen
%A Arthur Choi
%A Adnan Darwiche
%B Proceedings of the 10th International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2020
%E Manfred Jaeger
%E Thomas Dyhre Nielsen
%F pmlr-v138-chen20c
%I PMLR
%P 89--100
%U https://proceedings.mlr.press/v138/chen20c.html
%V 138
APA
Chen, Y., Choi, A. & Darwiche, A. (2020). Supervised Learning with Background Knowledge. Proceedings of the 10th International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 138:89-100. Available from https://proceedings.mlr.press/v138/chen20c.html.
