Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models

Shike Mei, Jun Zhu, Jerry Zhu
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):253-261, 2014.

Abstract

Much research in Bayesian modeling has been done to elicit a prior distribution that incorporates domain knowledge. We present a novel and more direct approach by imposing First-Order Logic (FOL) rules on the posterior distribution. Our approach unifies FOL and Bayesian modeling under the regularized Bayesian framework. In addition, our approach automatically estimates the uncertainty of FOL rules when they are produced by humans, so that reliable rules are incorporated while unreliable ones are ignored. We apply our approach to latent topic modeling tasks and demonstrate that by combining FOL knowledge and Bayesian modeling, we both improve the task performance and discover more structured latent representations in unsupervised and supervised learning.

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-mei14,
  title     = {Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models},
  author    = {Mei, Shike and Zhu, Jun and Zhu, Jerry},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {253--261},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/mei14.pdf},
  url       = {https://proceedings.mlr.press/v32/mei14.html},
  abstract  = {Much research in Bayesian modeling has been done to elicit a prior distribution that incorporates domain knowledge. We present a novel and more direct approach by imposing First-Order Logic (FOL) rules on the posterior distribution. Our approach unifies FOL and Bayesian modeling under the regularized Bayesian framework. In addition, our approach automatically estimates the uncertainty of FOL rules when they are produced by humans, so that reliable rules are incorporated while unreliable ones are ignored. We apply our approach to latent topic modeling tasks and demonstrate that by combining FOL knowledge and Bayesian modeling, we both improve the task performance and discover more structured latent representations in unsupervised and supervised learning.}
}
Endnote
%0 Conference Paper
%T Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models
%A Shike Mei
%A Jun Zhu
%A Jerry Zhu
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-mei14
%I PMLR
%P 253--261
%U https://proceedings.mlr.press/v32/mei14.html
%V 32
%N 1
%X Much research in Bayesian modeling has been done to elicit a prior distribution that incorporates domain knowledge. We present a novel and more direct approach by imposing First-Order Logic (FOL) rules on the posterior distribution. Our approach unifies FOL and Bayesian modeling under the regularized Bayesian framework. In addition, our approach automatically estimates the uncertainty of FOL rules when they are produced by humans, so that reliable rules are incorporated while unreliable ones are ignored. We apply our approach to latent topic modeling tasks and demonstrate that by combining FOL knowledge and Bayesian modeling, we both improve the task performance and discover more structured latent representations in unsupervised and supervised learning.
RIS
TY  - CPAPER
TI  - Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models
AU  - Shike Mei
AU  - Jun Zhu
AU  - Jerry Zhu
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/01/27
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-mei14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 1
SP  - 253
EP  - 261
L1  - http://proceedings.mlr.press/v32/mei14.pdf
UR  - https://proceedings.mlr.press/v32/mei14.html
AB  - Much research in Bayesian modeling has been done to elicit a prior distribution that incorporates domain knowledge. We present a novel and more direct approach by imposing First-Order Logic (FOL) rules on the posterior distribution. Our approach unifies FOL and Bayesian modeling under the regularized Bayesian framework. In addition, our approach automatically estimates the uncertainty of FOL rules when they are produced by humans, so that reliable rules are incorporated while unreliable ones are ignored. We apply our approach to latent topic modeling tasks and demonstrate that by combining FOL knowledge and Bayesian modeling, we both improve the task performance and discover more structured latent representations in unsupervised and supervised learning.
ER  -
APA
Mei, S., Zhu, J., & Zhu, J. (2014). Robust RegBayes: Selectively Incorporating First-Order Logic Domain Knowledge into Bayesian Models. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):253-261. Available from https://proceedings.mlr.press/v32/mei14.html.