Efficient Computation of Belief Theoretic Conditionals

Lalintha G. Polpitiya, Kamal Premaratne, Manohar N. Murthi, Dilip Sarkar
Proceedings of the Tenth International Symposium on Imprecise Probability: Theories and Applications, PMLR 62:265-276, 2017.

Abstract

Dempster-Shafer (DS) belief theory is a powerful general framework for dealing with a wide variety of uncertainties in data. As in Bayesian probability theory, the conditional operation plays a critical role in DS theoretic strategies for evidence updating and fusion. A major limitation of DS theoretic techniques for reasoning under uncertainty is the absence of a feasible computational framework that overcomes the prohibitive computational burden the conditional operation entails. This paper addresses this challenge via a novel generalized conditional computational model, DS-Conditional-One, which allows the conditional to be computed with significantly lower computational and space complexity. The model also provides valuable insight into the DS theoretic conditional itself and can be used as a tool for visualizing the conditional computation. We provide a thorough analysis and experimental validation of the utility, efficiency, and implementation of the proposed data structures and algorithms for carrying out both Dempster's conditional and the Fagin-Halpern conditional, the two most widely used DS theoretic conditioning strategies.
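For illustration only, the sketch below (in Python, with hypothetical names; it is not the paper's DS-Conditional-One model) shows a brute-force computation of the two conditionals the abstract refers to, directly from a basic belief assignment over a small frame of discernment. It makes explicit the quantities any efficient conditional computation must reproduce, and its cost grows with the number of focal elements, which is exactly the burden the paper targets. It assumes the conditioning event A has non-zero plausibility (Dempster) and a non-zero denominator (Fagin-Halpern).

def bel(mass, A):
    # Belief of A: total mass of non-empty focal elements contained in A.
    return sum(m for F, m in mass.items() if F and F <= A)

def pl(mass, A):
    # Plausibility of A: total mass of focal elements that intersect A.
    return sum(m for F, m in mass.items() if F & A)

def dempster_bel(mass, B, A, frame):
    # Dempster's conditional belief:
    #   Bel(B|A) = [Bel(B ∪ A^c) - Bel(A^c)] / [1 - Bel(A^c)],
    # assuming Pl(A) = 1 - Bel(A^c) > 0.
    Ac = frame - A
    denom = 1.0 - bel(mass, Ac)
    return (bel(mass, B | Ac) - bel(mass, Ac)) / denom

def fagin_halpern_bel(mass, B, A):
    # Fagin-Halpern conditional belief:
    #   Bel(B|A) = Bel(A ∩ B) / [Bel(A ∩ B) + Pl(A ∩ B^c)],
    # assuming the denominator is positive.
    num = bel(mass, A & B)
    return num / (num + pl(mass, A - B))

# Hypothetical example: frame {x, y, z} with three focal elements.
frame = frozenset('xyz')
mass = {frozenset('x'): 0.4, frozenset('yz'): 0.3, frozenset('xyz'): 0.3}
A, B = frozenset('xy'), frozenset('y')
print(dempster_bel(mass, B, A, frame))   # 0.3
print(fagin_halpern_bel(mass, B, A))     # 0.0

The example also shows that the two conditioning strategies can yield different results on the same evidence, which is why the paper treats both.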

Cite this Paper


BibTeX
@InProceedings{pmlr-v62-polpitiya17a,
  title     = {Efficient Computation of Belief Theoretic Conditionals},
  author    = {Polpitiya, Lalintha G. and Premaratne, Kamal and Murthi, Manohar N. and Sarkar, Dilip},
  booktitle = {Proceedings of the Tenth International Symposium on Imprecise Probability: Theories and Applications},
  pages     = {265--276},
  year      = {2017},
  editor    = {Antonucci, Alessandro and Corani, Giorgio and Couso, Inés and Destercke, Sébastien},
  volume    = {62},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--14 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v62/polpitiya17a/polpitiya17a.pdf},
  url       = {https://proceedings.mlr.press/v62/polpitiya17a.html},
  abstract  = {Dempster-Shafer (DS) belief theory is a powerful general framework for dealing with a wider variety of uncertainties in data. As in Bayesian probability theory, the conditional operation plays a critical role in DS theoretic strategies for evidence updating and fusion. A major limitation associated with the application of DS theoretic techniques for reasoning under uncertainty is the absence of a feasible computational framework to overcome the prohibitive computational burden this conditional operation entails. This paper addresses this critical challenge via a novel generalized conditional computational model — DS-Conditional-One — which allows the conditional to be computed in significantly less computational and space complexity. This computational model also provides valuable insight into the DS theoretic conditional itself and can be utilized as a tool for visualizing the conditional computation. We provide a thorough analysis and experimental validation of the utility, efficiency, and implementation of the proposed data structures and algorithms for carrying out both the Dempster’s conditional and Fagin-Halpern conditional, the two most widely utilized DS theoretic conditional strategies.}
}
Endnote
%0 Conference Paper
%T Efficient Computation of Belief Theoretic Conditionals
%A Lalintha G. Polpitiya
%A Kamal Premaratne
%A Manohar N. Murthi
%A Dilip Sarkar
%B Proceedings of the Tenth International Symposium on Imprecise Probability: Theories and Applications
%C Proceedings of Machine Learning Research
%D 2017
%E Alessandro Antonucci
%E Giorgio Corani
%E Inés Couso
%E Sébastien Destercke
%F pmlr-v62-polpitiya17a
%I PMLR
%P 265--276
%U https://proceedings.mlr.press/v62/polpitiya17a.html
%V 62
%X Dempster-Shafer (DS) belief theory is a powerful general framework for dealing with a wider variety of uncertainties in data. As in Bayesian probability theory, the conditional operation plays a critical role in DS theoretic strategies for evidence updating and fusion. A major limitation associated with the application of DS theoretic techniques for reasoning under uncertainty is the absence of a feasible computational framework to overcome the prohibitive computational burden this conditional operation entails. This paper addresses this critical challenge via a novel generalized conditional computational model — DS-Conditional-One — which allows the conditional to be computed in significantly less computational and space complexity. This computational model also provides valuable insight into the DS theoretic conditional itself and can be utilized as a tool for visualizing the conditional computation. We provide a thorough analysis and experimental validation of the utility, efficiency, and implementation of the proposed data structures and algorithms for carrying out both the Dempster’s conditional and Fagin-Halpern conditional, the two most widely utilized DS theoretic conditional strategies.
APA
Polpitiya, L.G., Premaratne, K., Murthi, M.N. & Sarkar, D. (2017). Efficient Computation of Belief Theoretic Conditionals. Proceedings of the Tenth International Symposium on Imprecise Probability: Theories and Applications, in Proceedings of Machine Learning Research 62:265-276. Available from https://proceedings.mlr.press/v62/polpitiya17a.html.
