Inference using Probabilistic Concept Trees

Doug Fisher, Doug Talbert
Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, PMLR R1:191-202, 1997.

Abstract

Discussions of ‘probabilistic reasoning systems’ often presuppose a belief network, which represents the joint probability distribution of a domain, as the primary knowledge structure. However, another common knowledge structure from which the joint probability distribution can be recovered is a hierarchical probabilistic clustering or probabilistic concept tree (Fisher, 1987). Probabilistic concept trees are a target structure for a number of clustering systems from machine learning such as COBWEB (Fisher, 1987) and systems by Hadzikadic and Yun (1989), Gennari, Langley, and Fisher (1989), Decaestecker (1991), Anderson and Matessa (1991), Reich and Fenves (1991), Biswas, Weinberg, and Li (1994), De Alte Da Veiga (1994), Kilander (1994), Ketterlin, Gançarski, and Korczak (1995), and Nevins (1995). Related probabilistic structures are produced by systems such as AUTOCLASS (Cheeseman, Kelly, Self, Stutz, Taylor, & Freeman, 1988), SNOB (Wallace & Boulton, 1968; Wallace & Dowe, 1994), and systems by Hanson and Bauer (1989) and Martin and Billman (1994). These systems can be easily adapted to form probabilistic concept trees of the type we describe. This paper will not focus on clustering systems per se, but on characteristics and capabilities of probabilistic concept trees, particularly as they relate to inference tasks often associated with belief networks. As ‘object-centered’ knowledge structures, probabilistic concept trees nicely complement the ‘variable-centered’ belief network structure.

Cite this Paper


BibTeX
@InProceedings{pmlr-vR1-fisher97a, title = {Inference using Probabilistic Concept Trees}, author = {Fisher, Doug and Talbert, Doug}, booktitle = {Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics}, pages = {191--202}, year = {1997}, editor = {Madigan, David and Smyth, Padhraic}, volume = {R1}, series = {Proceedings of Machine Learning Research}, month = {04--07 Jan}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/r1/fisher97a/fisher97a.pdf}, url = {https://proceedings.mlr.press/r1/fisher97a.html}, abstract = {Discussions of ‘probabilistic reasoning systems’ often presuppose a belief network, which represents the joint probability distribution of a domain, as the primary knowledge structure. However, another common knowledge structure from which the joint probability distribution can be recovered is a hierarchical probabilistic clustering or probabilistic concept tree (Fisher, 1987). Probabilistic concept trees are a target structure for a number of clustering systems from machine learning such as COBWEB (Fisher, 1987) and systems by Hadzikadic and Yun (1989), Gennari, Langley, and Fisher (1989), Decaestecker (1991), Anderson and Matessa (1991), Reich and Fenves (1991), Biswas, Weinberg, and Li (1994), De Alte Da Veiga (1994), Kilander (1994), Ketterlin, Gan{\c c}arski, and Korczak (1995), and Nevins (1995). Related probabilistic structures are produced by systems such as AUTOCLASS (Cheeseman, Kelly, Self, Stutz, Taylor, \& Freeman, 1988), SNOB (Wallace \& Boulton, 1968; Wallace \& Dowe, 1994), and systems by Hanson and Bauer (1989) and Martin and Billman (1994). These systems can be easily adapted to form probabilistic concept trees of the type we describe. This paper will not focus on clustering systems \emph{per se}, but on characteristics and capabilities of probabilistic concept trees, particularly as they relate to inference tasks often associated with belief networks. As ‘object-centered’ knowledge structures, probabilistic concept trees nicely complement the ‘variable-centered’ belief network structure.}, note = {Reissued by PMLR on 30 March 2021.} }
Endnote
%0 Conference Paper %T Inference using Probabilistic Concept Trees %A Doug Fisher %A Doug Talbert %B Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 1997 %E David Madigan %E Padhraic Smyth %F pmlr-vR1-fisher97a %I PMLR %P 191--202 %U https://proceedings.mlr.press/r1/fisher97a.html %V R1 %X Discussions of ‘probabilistic reasoning systems’ often presuppose a belief network, which represents the joint probability distribution of a domain, as the primary knowledge structure. However, another common knowledge structure from which the joint probability distribution can be recovered is a hierarchical probabilistic clustering or probabilistic concept tree (Fisher, 1987). Probabilistic concept trees are a target structure for a number of clustering systems from machine learning such as COBWEB (Fisher, 1987) and systems by Hadzikadic and Yun (1989), Gennari, Langley, and Fisher (1989), Decaestecker (1991), Anderson and Matessa (1991), Reich and Fenves (1991), Biswas, Weinberg, and Li (1994), De Alte Da Veiga (1994), Kilander (1994), Ketterlin, Gançarski, and Korczak (1995), and Nevins (1995). Related probabilistic structures are produced by systems such as AUTOCLASS (Cheeseman, Kelly, Self, Stutz, Taylor, & Freeman, 1988), SNOB (Wallace & Boulton, 1968; Wallace & Dowe, 1994), and systems by Hanson and Bauer (1989) and Martin and Billman (1994). These systems can be easily adapted to form probabilistic concept trees of the type we describe. This paper will not focus on clustering systems per se, but on characteristics and capabilities of probabilistic concept trees, particularly as they relate to inference tasks often associated with belief networks. As ‘object-centered’ knowledge structures, probabilistic concept trees nicely complement the ‘variable-centered’ belief network structure. %Z Reissued by PMLR on 30 March 2021.
APA
Fisher, D. & Talbert, D. (1997). Inference using Probabilistic Concept Trees. Proceedings of the Sixth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R1:191-202. Available from https://proceedings.mlr.press/r1/fisher97a.html. Reissued by PMLR on 30 March 2021.