Evaluating high-order predictive distributions in deep learning

Ian Osband, Zheng Wen, Seyed Mohammad Asghari, Vikranth Dwaracherla, Xiuyuan Lu, Benjamin Van Roy
Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, PMLR 180:1552-1560, 2022.

Abstract

Most research on supervised learning has focused on marginal predictions. In decision problems, joint predictive distributions are essential for good performance. Previous work has developed methods for assessing low-order predictive distributions with inputs sampled i.i.d. from the testing distribution. With low-dimensional inputs, these methods distinguish agents that effectively estimate uncertainty from those that do not. We establish that the predictive distribution order required for such differentiation increases greatly with input dimension, rendering these methods impractical. To accommodate high-dimensional inputs, we introduce dyadic sampling, which focuses on predictive distributions associated with random pairs of inputs. We demonstrate that this approach efficiently distinguishes agents in high-dimensional examples involving simple logistic regression as well as complex synthetic and empirical data.
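
The abstract describes dyadic sampling only at a high level: joint predictive distributions are evaluated on batches built from random pairs of inputs. As a rough illustration of that idea, the sketch below estimates a joint log-loss on such a batch for a classification agent that can sample coherent predictions from a posterior or ensemble. All names and parameters here (sample_input, sample_labels, agent_sample_probs, tau, n_model_samples) are illustrative assumptions, not the paper's actual API or experimental setup.

import numpy as np

def dyadic_joint_log_loss(sample_input, sample_labels, agent_sample_probs,
                          tau=10, n_model_samples=100, seed=None):
    """Sketch of a dyadic-sampling evaluation (assumed interface, not the authors' code).

    sample_input():          returns one input x drawn from the testing distribution.
    sample_labels(xs):       returns one realised label per input in xs.
    agent_sample_probs(xs):  returns class probabilities of shape (len(xs), n_classes)
                             under a single sampled model from the agent's posterior/ensemble.
    """
    rng = np.random.default_rng(seed)
    # Dyadic sampling: anchor two random inputs, then fill the batch of size tau
    # with repeated copies of that pair.
    anchors = [sample_input(), sample_input()]
    idx = rng.integers(0, 2, size=tau)
    xs = [anchors[i] for i in idx]
    ys = np.asarray(sample_labels(xs))

    # Monte Carlo estimate of the joint likelihood under the agent's joint
    # predictive: average, over sampled models, of the product of per-input
    # probabilities assigned to the realised labels.
    joint_likelihoods = []
    for _ in range(n_model_samples):
        probs = agent_sample_probs(xs)                        # (tau, n_classes)
        joint_likelihoods.append(np.prod(probs[np.arange(tau), ys]))
    return -np.log(np.mean(joint_likelihoods) + 1e-12)

In this sketch, tau plays the role of the order of the joint predictive being evaluated; averaging the returned log-loss over many sampled pairs would give the kind of evaluation statistic the abstract refers to.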

Cite this Paper


BibTeX
@InProceedings{pmlr-v180-osband22a,
  title     = {Evaluating high-order predictive distributions in deep learning},
  author    = {Osband, Ian and Wen, Zheng and Asghari, Seyed Mohammad and Dwaracherla, Vikranth and Lu, Xiuyuan and Van Roy, Benjamin},
  booktitle = {Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence},
  pages     = {1552--1560},
  year      = {2022},
  editor    = {Cussens, James and Zhang, Kun},
  volume    = {180},
  series    = {Proceedings of Machine Learning Research},
  month     = {01--05 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v180/osband22a/osband22a.pdf},
  url       = {https://proceedings.mlr.press/v180/osband22a.html},
  abstract  = {Most work on supervised learning research has focused on marginal predictions. In decision problems, joint predictive distributions are essential for good performance. Previous work has developed methods for assessing low-order predictive distributions with inputs sampled i.i.d. from the testing distribution. With low-dimensional inputs, these methods distinguish agents that effectively estimate uncertainty from those that do not. We establish that the predictive distribution order required for such differentiation increases greatly with input dimension, rendering these methods impractical. To accommodate high-dimensional inputs, we introduce dyadic sampling, which focuses on predictive distributions associated with random pairs of inputs. We demonstrate that this approach efficiently distinguishes agents in high-dimensional examples involving simple logistic regression as well as complex synthetic and empirical data.}
}
Endnote
%0 Conference Paper
%T Evaluating high-order predictive distributions in deep learning
%A Ian Osband
%A Zheng Wen
%A Seyed Mohammad Asghari
%A Vikranth Dwaracherla
%A Xiuyuan Lu
%A Benjamin Van Roy
%B Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2022
%E James Cussens
%E Kun Zhang
%F pmlr-v180-osband22a
%I PMLR
%P 1552--1560
%U https://proceedings.mlr.press/v180/osband22a.html
%V 180
%X Most work on supervised learning research has focused on marginal predictions. In decision problems, joint predictive distributions are essential for good performance. Previous work has developed methods for assessing low-order predictive distributions with inputs sampled i.i.d. from the testing distribution. With low-dimensional inputs, these methods distinguish agents that effectively estimate uncertainty from those that do not. We establish that the predictive distribution order required for such differentiation increases greatly with input dimension, rendering these methods impractical. To accommodate high-dimensional inputs, we introduce dyadic sampling, which focuses on predictive distributions associated with random pairs of inputs. We demonstrate that this approach efficiently distinguishes agents in high-dimensional examples involving simple logistic regression as well as complex synthetic and empirical data.
APA
Osband, I., Wen, Z., Asghari, S.M., Dwaracherla, V., Lu, X. & Van Roy, B. (2022). Evaluating high-order predictive distributions in deep learning. Proceedings of the Thirty-Eighth Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 180:1552-1560. Available from https://proceedings.mlr.press/v180/osband22a.html.