Observation subset selection as local compilation of performance profiles

Yan Radovilsky, Solomon Eyal Shimony
Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, PMLR R6:460-467, 2008.

Abstract

Deciding what to sense is a crucial task, made harder by dependencies and by a non-additive utility function. We develop approximation algorithms for selecting an optimal set of measurements, under a dependency structure modeled by a tree-shaped Bayesian network (BN). Our approach is a generalization of composing anytime algorithms represented by conditional performance profiles. This is done by relaxing the input monotonicity assumption, and extending the local compilation technique to more general classes of performance profiles (PPs). We apply the extended scheme to selecting a subset of measurements for choosing a maximum expectation variable in a binary valued BN, and for minimizing the worst variance in a Gaussian BN.

Cite this Paper


BibTeX
@InProceedings{pmlr-vR6-radovilsky08a,
  title = {Observation subset selection as local compilation of performance profiles},
  author = {Radovilsky, Yan and Shimony, Solomon Eyal},
  booktitle = {Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence},
  pages = {460--467},
  year = {2008},
  editor = {McAllester, David A. and Myllymäki, Petri},
  volume = {R6},
  series = {Proceedings of Machine Learning Research},
  month = {09--12 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/r6/main/assets/radovilsky08a/radovilsky08a.pdf},
  url = {https://proceedings.mlr.press/r6/radovilsky08a.html},
  abstract = {Deciding what to sense is a crucial task, made harder by dependencies and by a non-additive utility function. We develop approximation algorithms for selecting an optimal set of measurements, under a dependency structure modeled by a tree-shaped Bayesian network (BN). Our approach is a generalization of composing anytime algorithms represented by conditional performance profiles. This is done by relaxing the input monotonicity assumption, and extending the local compilation technique to more general classes of performance profiles (PPs). We apply the extended scheme to selecting a subset of measurements for choosing a maximum expectation variable in a binary valued BN, and for minimizing the worst variance in a Gaussian BN.},
  note = {Reissued by PMLR on 09 October 2024.}
}
Endnote
%0 Conference Paper
%T Observation subset selection as local compilation of performance profiles
%A Yan Radovilsky
%A Solomon Eyal Shimony
%B Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2008
%E David A. McAllester
%E Petri Myllymäki
%F pmlr-vR6-radovilsky08a
%I PMLR
%P 460--467
%U https://proceedings.mlr.press/r6/radovilsky08a.html
%V R6
%X Deciding what to sense is a crucial task, made harder by dependencies and by a non-additive utility function. We develop approximation algorithms for selecting an optimal set of measurements, under a dependency structure modeled by a tree-shaped Bayesian network (BN). Our approach is a generalization of composing anytime algorithms represented by conditional performance profiles. This is done by relaxing the input monotonicity assumption, and extending the local compilation technique to more general classes of performance profiles (PPs). We apply the extended scheme to selecting a subset of measurements for choosing a maximum expectation variable in a binary valued BN, and for minimizing the worst variance in a Gaussian BN.
%Z Reissued by PMLR on 09 October 2024.
APA
Radovilsky, Y. & Shimony, S. E. (2008). Observation subset selection as local compilation of performance profiles. Proceedings of the 24th Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research R6:460-467. Available from https://proceedings.mlr.press/r6/radovilsky08a.html. Reissued by PMLR on 09 October 2024.
