An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models

Masakazu Ishihata, Yoshitaka Kameya, Taisuke Sato, Shin-ichi Minato
Proceedings of 2nd Asian Conference on Machine Learning, JMLR Workshop and Conference Proceedings 13:161-176, 2010.

Abstract

Logic-based probabilistic models (LBPMs) enable us to handle various real-world problems thanks to the expressive power of logic. However, most LBPMs impose restrictions in order to realize efficient probability computation and learning. We propose an EM algorithm that works on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without such restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), this complexity is the same as that of the EM algorithm specialized for HMMs. As an example of eliminating the restrictions of existing approaches, we use our algorithm to give diagnoses for failures in a logic circuit involving stochastic error gates.
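As a rough, illustrative sketch of the kind of computation the abstract refers to (not the paper's algorithm, which additionally handles order-encoded multi-valued variables and the expectations needed for the EM steps), the Python snippet below computes the probability that a Boolean formula is true by one bottom-up pass over its BDD, so the cost is linear in the number of BDD nodes. The Node class, the example formula (x AND y) OR z, and the variable probabilities are assumptions made for illustration only.

# Minimal sketch (assumed, not taken from the paper): probability of a Boolean
# formula under independent Boolean variables, computed by a single bottom-up
# pass over its BDD.
class Node:
    """A (reduced, ordered) BDD node; terminals are the constants True/False."""
    def __init__(self, var, low, high):
        self.var, self.low, self.high = var, low, high

def prob_true(root, theta):
    """P(formula = 1) given independent P(var = 1) = theta[var].
    Each shared node is evaluated once, so the cost is proportional
    to the number of BDD nodes."""
    memo = {}
    def rec(node):
        if node is True:
            return 1.0
        if node is False:
            return 0.0
        if id(node) not in memo:
            p = theta[node.var]
            memo[id(node)] = (1.0 - p) * rec(node.low) + p * rec(node.high)
        return memo[id(node)]
    return rec(root)

# Example: BDD for (x AND y) OR z with variable order x < y < z.
z_node = Node('z', False, True)     # encodes the subformula z
y_or_z = Node('y', z_node, True)    # encodes y OR z (reached when x = 1)
root   = Node('x', z_node, y_or_z)  # encodes (x AND y) OR z

theta = {'x': 0.6, 'y': 0.5, 'z': 0.1}
print(prob_true(root, theta))       # 0.37 = P(x)P(y) + (1 - P(x)P(y))P(z)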

Cite this Paper


BibTeX
@InProceedings{pmlr-v13-ishihata10a,
  title     = {An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models},
  author    = {Masakazu Ishihata and Yoshitaka Kameya and Taisuke Sato and Shin-ichi Minato},
  booktitle = {Proceedings of 2nd Asian Conference on Machine Learning},
  pages     = {161--176},
  year      = {2010},
  editor    = {Masashi Sugiyama and Qiang Yang},
  volume    = {13},
  series    = {Proceedings of Machine Learning Research},
  address   = {Tokyo, Japan},
  month     = {08--10 Nov},
  publisher = {JMLR Workshop and Conference Proceedings},
  pdf       = {http://proceedings.mlr.press/v13/ishihata10a/ishihata10a.pdf},
  url       = {http://proceedings.mlr.press/v13/ishihata10a.html},
  abstract  = {Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.}
}
Endnote
%0 Conference Paper
%T An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models
%A Masakazu Ishihata
%A Yoshitaka Kameya
%A Taisuke Sato
%A Shin-ichi Minato
%B Proceedings of 2nd Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2010
%E Masashi Sugiyama
%E Qiang Yang
%F pmlr-v13-ishihata10a
%I PMLR
%J Proceedings of Machine Learning Research
%P 161--176
%U http://proceedings.mlr.press
%V 13
%W PMLR
%X Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.
RIS
TY - CPAPER
TI - An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models
AU - Masakazu Ishihata
AU - Yoshitaka Kameya
AU - Taisuke Sato
AU - Shin-ichi Minato
BT - Proceedings of 2nd Asian Conference on Machine Learning
PY - 2010/10/31
DA - 2010/10/31
ED - Masashi Sugiyama
ED - Qiang Yang
ID - pmlr-v13-ishihata10a
PB - PMLR
SP - 161
DP - PMLR
EP - 176
L1 - http://proceedings.mlr.press/v13/ishihata10a/ishihata10a.pdf
UR - http://proceedings.mlr.press/v13/ishihata10a.html
AB - Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.
ER -
APA
Ishihata, M., Kameya, Y., Sato, T. & Minato, S. (2010). An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models. Proceedings of 2nd Asian Conference on Machine Learning, in PMLR 13:161-176.