An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models

Masakazu Ishihata, Yoshitaka Kameya, Taisuke Sato, Shin-ichi Minato
Proceedings of 2nd Asian Conference on Machine Learning, PMLR 13:161-176, 2010.

Abstract

Logic-based probabilistic models (LBPMs) enable us to handle various real-world problems thanks to the expressive power of logic. However, most LBPMs impose restrictions in order to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that of the EM algorithm specialized for HMMs. As an example of eliminating the restrictions of existing approaches, we apply our algorithm to diagnose failures in a logic circuit involving stochastic error gates.
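The abstract states that the algorithm's complexity is proportional to the size of the BDD. The core of that claim is a memoized bottom-up pass over the shared BDD graph, which the following hypothetical sketch illustrates (not the authors' code; the tuple node encoding and the `theta` parameter map are illustrative assumptions):

```python
# Hypothetical sketch of the bottom-up "backward probability" pass used by
# BDD-based EM algorithms. Each internal node (var, hi, lo) tests a Boolean
# variable: the hi branch is taken with probability theta[var], the lo
# branch otherwise. Memoizing on node identity visits each BDD node once,
# so the pass is linear in the number of BDD nodes.

def prob(node, theta, memo=None):
    """Probability that the Boolean formula encoded by `node` is true."""
    if memo is None:
        memo = {}
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    if id(node) not in memo:
        var, hi, lo = node
        memo[id(node)] = (theta[var] * prob(hi, theta, memo)
                          + (1.0 - theta[var]) * prob(lo, theta, memo))
    return memo[id(node)]

# BDD for (x AND y) under the variable order x < y
bdd = ('x', ('y', True, False), False)
print(prob(bdd, {'x': 0.6, 'y': 0.5}))  # 0.3
```

The E-step of the paper's algorithm additionally needs a top-down pass to accumulate expected counts for each parameter, but it traverses the same node set and therefore has the same linear cost.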

Cite this Paper

BibTeX
@InProceedings{pmlr-v13-ishihata10a,
  title     = {An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models},
  author    = {Ishihata, Masakazu and Kameya, Yoshitaka and Sato, Taisuke and Minato, Shin-ichi},
  booktitle = {Proceedings of 2nd Asian Conference on Machine Learning},
  pages     = {161--176},
  year      = {2010},
  editor    = {Sugiyama, Masashi and Yang, Qiang},
  volume    = {13},
  series    = {Proceedings of Machine Learning Research},
  address   = {Tokyo, Japan},
  month     = {08--10 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v13/ishihata10a/ishihata10a.pdf},
  url       = {https://proceedings.mlr.press/v13/ishihata10a.html},
  abstract  = {Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.}
}
Endnote
%0 Conference Paper
%T An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models
%A Masakazu Ishihata
%A Yoshitaka Kameya
%A Taisuke Sato
%A Shin-ichi Minato
%B Proceedings of 2nd Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2010
%E Masashi Sugiyama
%E Qiang Yang
%F pmlr-v13-ishihata10a
%I PMLR
%P 161--176
%U https://proceedings.mlr.press/v13/ishihata10a.html
%V 13
%X Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.
RIS
TY  - CPAPER
TI  - An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models
AU  - Masakazu Ishihata
AU  - Yoshitaka Kameya
AU  - Taisuke Sato
AU  - Shin-ichi Minato
BT  - Proceedings of 2nd Asian Conference on Machine Learning
DA  - 2010/10/31
ED  - Masashi Sugiyama
ED  - Qiang Yang
ID  - pmlr-v13-ishihata10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 13
SP  - 161
EP  - 176
L1  - http://proceedings.mlr.press/v13/ishihata10a/ishihata10a.pdf
UR  - https://proceedings.mlr.press/v13/ishihata10a.html
AB  - Logic-based probabilistic models (LBPMs) enable us to handle various problems in the real world thanks to the expressive power of logic. However, most of LBPMs have restrictions to realize efficient probability computation and learning. We propose an EM algorithm working on BDDs with order encoding for LBPMs. A notable advantage of our algorithm over existing approaches is that it copes with multi-valued random variables without restrictions. The complexity of our algorithm is proportional to the size of the BDD. In the case of hidden Markov models (HMMs), the complexity is the same as that specialized for HMMs. As an example to eliminate restrictions of existing approaches, we utilize our algorithm to give diagnoses for failure in a logic circuit involving stochastic error gates.
ER  -
APA
Ishihata, M., Kameya, Y., Sato, T. & Minato, S. (2010). An EM Algorithm on BDDs with Order Encoding for Logic-based Probabilistic Models. Proceedings of 2nd Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 13:161-176. Available from https://proceedings.mlr.press/v13/ishihata10a.html.