Bayesian Networks: a Combined Tuning Heuristic

Janneke H. Bolt
Proceedings of the Eighth International Conference on Probabilistic Graphical Models, PMLR 52:37-49, 2016.

Abstract

One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In one existing heuristic, parameters are tied such that their changes locally induce a maximal change of the tuned probability. This heuristic, however, may reduce the attainable values of the tuned probability considerably. In another existing heuristic, parameters are tied such that they change simultaneously over the entire interval ⟨0,1⟩. The tuning range of this heuristic will in general be larger than the tuning range of the locally optimal heuristic. A disadvantage, however, is that knowledge of the locally optimal change is not exploited. In this paper, a heuristic is proposed that is locally optimal, yet covers the larger tuning range of the second heuristic. Preliminary experiments show that this heuristic is a promising alternative.

Cite this Paper


BibTeX
@InProceedings{pmlr-v52-bolt16,
  title     = {{B}ayesian Networks: a Combined Tuning Heuristic},
  author    = {Bolt, Janneke H.},
  booktitle = {Proceedings of the Eighth International Conference on Probabilistic Graphical Models},
  pages     = {37--49},
  year      = {2016},
  editor    = {Antonucci, Alessandro and Corani, Giorgio and Campos, Cassio Polpo},
  volume    = {52},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lugano, Switzerland},
  month     = {06--09 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v52/bolt16.pdf},
  url       = {https://proceedings.mlr.press/v52/bolt16.html},
  abstract  = {One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In one existing heuristic, parameters are tied such that their changes locally induce a maximal change of the tuned probability. This heuristic, however, may reduce the attainable values of the tuned probability considerably. In another existing heuristic, parameters are tied such that they change simultaneously over the entire interval ⟨0,1⟩. The tuning range of this heuristic will in general be larger than the tuning range of the locally optimal heuristic. A disadvantage, however, is that knowledge of the locally optimal change is not exploited. In this paper, a heuristic is proposed that is locally optimal, yet covers the larger tuning range of the second heuristic. Preliminary experiments show that this heuristic is a promising alternative.}
}
Endnote
%0 Conference Paper
%T Bayesian Networks: a Combined Tuning Heuristic
%A Janneke H. Bolt
%B Proceedings of the Eighth International Conference on Probabilistic Graphical Models
%C Proceedings of Machine Learning Research
%D 2016
%E Alessandro Antonucci
%E Giorgio Corani
%E Cassio Polpo Campos
%F pmlr-v52-bolt16
%I PMLR
%P 37--49
%U https://proceedings.mlr.press/v52/bolt16.html
%V 52
%X One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In one existing heuristic, parameters are tied such that their changes locally induce a maximal change of the tuned probability. This heuristic, however, may reduce the attainable values of the tuned probability considerably. In another existing heuristic, parameters are tied such that they change simultaneously over the entire interval ⟨0,1⟩. The tuning range of this heuristic will in general be larger than the tuning range of the locally optimal heuristic. A disadvantage, however, is that knowledge of the locally optimal change is not exploited. In this paper, a heuristic is proposed that is locally optimal, yet covers the larger tuning range of the second heuristic. Preliminary experiments show that this heuristic is a promising alternative.
RIS
TY  - CPAPER
TI  - Bayesian Networks: a Combined Tuning Heuristic
AU  - Janneke H. Bolt
BT  - Proceedings of the Eighth International Conference on Probabilistic Graphical Models
DA  - 2016/08/15
ED  - Alessandro Antonucci
ED  - Giorgio Corani
ED  - Cassio Polpo Campos
ID  - pmlr-v52-bolt16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 52
SP  - 37
EP  - 49
L1  - http://proceedings.mlr.press/v52/bolt16.pdf
UR  - https://proceedings.mlr.press/v52/bolt16.html
AB  - One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In one existing heuristic, parameters are tied such that their changes locally induce a maximal change of the tuned probability. This heuristic, however, may reduce the attainable values of the tuned probability considerably. In another existing heuristic, parameters are tied such that they change simultaneously over the entire interval ⟨0,1⟩. The tuning range of this heuristic will in general be larger than the tuning range of the locally optimal heuristic. A disadvantage, however, is that knowledge of the locally optimal change is not exploited. In this paper, a heuristic is proposed that is locally optimal, yet covers the larger tuning range of the second heuristic. Preliminary experiments show that this heuristic is a promising alternative.
ER  -
APA
Bolt, J.H. (2016). Bayesian Networks: a Combined Tuning Heuristic. Proceedings of the Eighth International Conference on Probabilistic Graphical Models, in Proceedings of Machine Learning Research 52:37-49. Available from https://proceedings.mlr.press/v52/bolt16.html.