Distance-to-Set Priors and Constrained Bayesian Inference

Rick Presman, Jason Xu
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:2310-2326, 2023.

Abstract

Constrained learning is prevalent in many statistical tasks. Recent work proposes distance-to-set penalties to derive estimators under general constraints that can be specified as sets, but focuses on obtaining point estimates that do not come with corresponding measures of uncertainty. To remedy this, we approach distance-to-set regularization from a Bayesian lens. We consider a class of smooth distance-to-set priors, showing that they yield well-defined posteriors toward quantifying uncertainty for constrained learning problems. We discuss relationships and advantages over prior work on Bayesian constraint relaxation. Moreover, we prove that our approach is optimal in an information-geometric sense for finite penalty parameters $\rho$, and enjoys favorable statistical properties when $\rho \rightarrow \infty$. The method is designed to perform effectively within gradient-based MCMC samplers, as illustrated on a suite of simulated and real data applications.
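To make the idea concrete, here is a minimal sketch (not the authors' implementation) of the ingredient the abstract describes: a smooth distance-to-set prior built from the squared Euclidean distance $d(\theta, C)^2 = \|\theta - P_C(\theta)\|^2$, which is computable whenever the projection $P_C$ onto the constraint set is available. The box constraint, the clipping projection, and the function names below are illustrative assumptions, chosen because the box projection is coordinatewise clipping:

```python
import numpy as np

def dist2_to_box(theta, lo=0.0, hi=1.0):
    """Squared Euclidean distance from theta to the box [lo, hi]^d.
    The Euclidean projection onto a box is coordinatewise clipping."""
    proj = np.clip(theta, lo, hi)
    return float(np.sum((theta - proj) ** 2))

def log_prior(theta, rho):
    """Unnormalized log of a distance-to-set prior: -rho * d(theta, C)^2.
    Points inside C incur no penalty; the penalty grows smoothly outside,
    and larger rho concentrates the prior near the constraint set."""
    return -rho * dist2_to_box(theta)

theta_in = np.array([0.2, 0.9])    # inside the box: zero penalty
theta_out = np.array([1.5, -0.5])  # outside: d^2 = 0.5^2 + 0.5^2 = 0.5
print(log_prior(theta_in, rho=10.0))   # 0.0
print(log_prior(theta_out, rho=10.0))  # -5.0
```

For convex $C$, the squared distance is differentiable with gradient $2(\theta - P_C(\theta))$, which is what makes such priors compatible with the gradient-based MCMC samplers mentioned in the abstract.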

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-presman23a,
  title = {Distance-to-Set Priors and Constrained Bayesian Inference},
  author = {Presman, Rick and Xu, Jason},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages = {2310--2326},
  year = {2023},
  editor = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume = {206},
  series = {Proceedings of Machine Learning Research},
  month = {25--27 Apr},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v206/presman23a/presman23a.pdf},
  url = {https://proceedings.mlr.press/v206/presman23a.html},
  abstract = {Constrained learning is prevalent in many statistical tasks. Recent work proposes distance-to-set penalties to derive estimators under general constraints that can be specified as sets, but focuses on obtaining point estimates that do not come with corresponding measures of uncertainty. To remedy this, we approach distance-to-set regularization from a Bayesian lens. We consider a class of smooth distance-to-set priors, showing that they yield well-defined posteriors toward quantifying uncertainty for constrained learning problems. We discuss relationships and advantages over prior work on Bayesian constraint relaxation. Moreover, we prove that our approach is optimal in an information-geometric sense for finite penalty parameters $\rho$, and enjoys favorable statistical properties when $\rho \rightarrow \infty$. The method is designed to perform effectively within gradient-based MCMC samplers, as illustrated on a suite of simulated and real data applications.}
}
Endnote
%0 Conference Paper
%T Distance-to-Set Priors and Constrained Bayesian Inference
%A Rick Presman
%A Jason Xu
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-presman23a
%I PMLR
%P 2310--2326
%U https://proceedings.mlr.press/v206/presman23a.html
%V 206
%X Constrained learning is prevalent in many statistical tasks. Recent work proposes distance-to-set penalties to derive estimators under general constraints that can be specified as sets, but focuses on obtaining point estimates that do not come with corresponding measures of uncertainty. To remedy this, we approach distance-to-set regularization from a Bayesian lens. We consider a class of smooth distance-to-set priors, showing that they yield well-defined posteriors toward quantifying uncertainty for constrained learning problems. We discuss relationships and advantages over prior work on Bayesian constraint relaxation. Moreover, we prove that our approach is optimal in an information-geometric sense for finite penalty parameters $\rho$, and enjoys favorable statistical properties when $\rho \rightarrow \infty$. The method is designed to perform effectively within gradient-based MCMC samplers, as illustrated on a suite of simulated and real data applications.
APA
Presman, R. & Xu, J. (2023). Distance-to-Set Priors and Constrained Bayesian Inference. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:2310-2326. Available from https://proceedings.mlr.press/v206/presman23a.html.