Flexible Prior Elicitation via the Prior Predictive Distribution

Marcelo Hartmann, Georgi Agiashvili, Paul Bürkner, Arto Klami
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:1129-1138, 2020.

Abstract

The prior distribution for the unknown model parameters plays a crucial role in the process of statistical inference based on Bayesian methods. However, specifying suitable priors is often difficult even when detailed prior knowledge is available in principle. The challenge is to express quantitative information in the form of a probability distribution. Prior elicitation addresses this question by extracting subjective information from an expert and transforming it into a valid prior. Most existing methods, however, require information to be provided on the unobservable parameters, whose effect on the data generating process is often complicated and hard to understand. We propose an alternative approach that only requires knowledge about the observable outcomes, knowledge which is often much easier for experts to provide. Building upon a principled statistical framework, our approach utilizes the prior predictive distribution implied by the model to automatically transform experts' judgements about plausible outcome values to suitable priors on the parameters. We also provide computational strategies to perform inference and guidelines to facilitate practical use.
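The core idea of the abstract can be illustrated with a minimal sketch (not the paper's actual algorithm): an expert states quantiles for the *observable* outcome, and we search for hyperparameters of a prior on the *unobservable* parameter so that the quantiles of the implied prior predictive distribution match the expert's statements. All numbers, the Normal model, and the grid search below are hypothetical simplifications for illustration only.

```python
# Illustrative sketch of prior-predictive-based elicitation, assuming a
# toy model: theta ~ N(mu0, tau0^2), y | theta ~ N(theta, sigma^2).
# The expert judges quantiles of the observable outcome y, not of theta.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expert judgement: outcome quartiles (25%, 50%, 75%).
expert_q = np.array([4.0, 10.0, 16.0])
probs = np.array([0.25, 0.50, 0.75])

def prior_predictive_quantiles(mu0, tau0, sigma=2.0, n=20000):
    """Monte Carlo quantiles of y under the prior predictive distribution."""
    theta = rng.normal(mu0, tau0, size=n)  # draw parameters from the prior
    y = rng.normal(theta, sigma)           # draw outcomes given parameters
    return np.quantile(y, probs)

def loss(mu0, tau0):
    """Squared mismatch between implied and expert-stated quantiles."""
    return np.sum((prior_predictive_quantiles(mu0, tau0) - expert_q) ** 2)

# Crude grid search over the prior's hyperparameters; the paper instead
# develops principled computational strategies for this inference step.
grid_mu = np.linspace(0, 20, 41)
grid_tau = np.linspace(0.5, 15, 30)
_, mu_hat, tau_hat = min((loss(m, t), m, t)
                         for m in grid_mu for t in grid_tau)
print(f"elicited prior: theta ~ N({mu_hat:.1f}, {tau_hat:.1f}^2)")
```

The elicited prior centers near the expert's stated median and widens to absorb the spread between the stated quartiles that the outcome noise alone cannot explain.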

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-hartmann20a,
  title     = {Flexible Prior Elicitation via the Prior Predictive Distribution},
  author    = {Hartmann, Marcelo and Agiashvili, Georgi and B\"{u}rkner, Paul and Klami, Arto},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages     = {1129--1138},
  year      = {2020},
  editor    = {Peters, Jonas and Sontag, David},
  volume    = {124},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--06 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v124/hartmann20a/hartmann20a.pdf},
  url       = {https://proceedings.mlr.press/v124/hartmann20a.html},
  abstract  = {The prior distribution for the unknown model parameters plays a crucial role in the process of statistical inference based on Bayesian methods. However, specifying suitable priors is often difficult even when detailed prior knowledge is available in principle. The challenge is to express quantitative information in the form of a probability distribution. Prior elicitation addresses this question by extracting subjective information from an expert and transforming it into a valid prior. Most existing methods, however, require information to be provided on the unobservable parameters, whose effect on the data generating process is often complicated and hard to understand. We propose an alternative approach that only requires knowledge about the observable outcomes - knowledge which is often much easier for experts to provide. Building upon a principled statistical framework, our approach utilizes the prior predictive distribution implied by the model to automatically transform experts judgements about plausible outcome values to suitable priors on the parameters. We also provide computational strategies to perform inference and guidelines to facilitate practical use.}
}
Endnote
%0 Conference Paper
%T Flexible Prior Elicitation via the Prior Predictive Distribution
%A Marcelo Hartmann
%A Georgi Agiashvili
%A Paul Bürkner
%A Arto Klami
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-hartmann20a
%I PMLR
%P 1129--1138
%U https://proceedings.mlr.press/v124/hartmann20a.html
%V 124
%X The prior distribution for the unknown model parameters plays a crucial role in the process of statistical inference based on Bayesian methods. However, specifying suitable priors is often difficult even when detailed prior knowledge is available in principle. The challenge is to express quantitative information in the form of a probability distribution. Prior elicitation addresses this question by extracting subjective information from an expert and transforming it into a valid prior. Most existing methods, however, require information to be provided on the unobservable parameters, whose effect on the data generating process is often complicated and hard to understand. We propose an alternative approach that only requires knowledge about the observable outcomes - knowledge which is often much easier for experts to provide. Building upon a principled statistical framework, our approach utilizes the prior predictive distribution implied by the model to automatically transform experts judgements about plausible outcome values to suitable priors on the parameters. We also provide computational strategies to perform inference and guidelines to facilitate practical use.
APA
Hartmann, M., Agiashvili, G., Bürkner, P. & Klami, A. (2020). Flexible Prior Elicitation via the Prior Predictive Distribution. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:1129-1138. Available from https://proceedings.mlr.press/v124/hartmann20a.html.