Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior

Yohan Jung, Jinkyoo Park
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3795-3824, 2023.

Abstract

Convolutional deep sets is a neural network architecture that can model stationary stochastic processes. This architecture uses a kernel smoother and a deep convolutional neural network to construct translation-equivariant functional representations. However, the non-parametric nature of the kernel smoother can produce ambiguous representations when too few data points are given. To address this issue, we introduce Bayesian convolutional deep sets, which constructs random translation-equivariant functional representations with a stationary prior. Furthermore, we present how to impose a task-dependent prior for each dataset, because a wrongly imposed prior can yield an even worse representation than that of the kernel smoother. Empirically, we demonstrate that the proposed architecture alleviates the targeted issue in various experiments on time-series and image datasets.
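
To make the functional representations in the abstract concrete, the sketch below contrasts the deterministic kernel-smoother encoding used by convolutional deep sets with a random, stationary-prior variant built from Monte Carlo frequency samples. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the RBF lengthscale, and the use of cosine features of input differences as the stationary prior are illustrative choices, and a full model would pass either representation through a CNN decoder.

import numpy as np

def rbf_smoother(t_grid, x_ctx, lengthscale=0.1):
    # Kernel-smoother weights k(t - x): the kernel depends only on input
    # differences, which is what makes the representation translation equivariant.
    diff = t_grid[:, None] - x_ctx[None, :]               # (G, N)
    return np.exp(-0.5 * (diff / lengthscale) ** 2)

def conv_deep_sets_encoding(x_ctx, y_ctx, t_grid, lengthscale=0.1):
    # Deterministic ConvDeepSets-style functional representation:
    # a density channel and a normalized data channel on a uniform grid,
    # which a CNN decoder would then process.
    w = rbf_smoother(t_grid, x_ctx, lengthscale)          # (G, N)
    density = w.sum(axis=1)                               # density channel
    data = (w @ y_ctx) / np.maximum(density, 1e-8)        # data channel
    return np.stack([density, data], axis=-1)             # (G, 2)

def random_stationary_encoding(x_ctx, y_ctx, t_grid, freqs):
    # Hypothetical sketch of a *random* functional representation: frequencies
    # drawn from a stationary spectral density (the prior) give a Monte Carlo
    # approximation of a stationary kernel, roughly mean_m cos(w_m * (t - x)).
    diff = t_grid[:, None] - x_ctx[None, :]               # (G, N)
    w = np.cos(freqs[:, None, None] * diff[None, :, :]).mean(axis=0)
    return np.stack([w.sum(axis=1), w @ y_ctx], axis=-1)  # (G, 2)

# Usage on a toy 1D task: 20 context points, a grid of 128 locations,
# and 64 frequency samples standing in for a stationary spectral prior.
rng = np.random.default_rng(0)
x_ctx = rng.uniform(-2.0, 2.0, size=20)
y_ctx = np.sin(3.0 * x_ctx)
t_grid = np.linspace(-3.0, 3.0, 128)
det_repr = conv_deep_sets_encoding(x_ctx, y_ctx, t_grid)
freqs = rng.normal(0.0, 3.0, size=64)
rand_repr = random_stationary_encoding(x_ctx, y_ctx, t_grid, freqs)
print(det_repr.shape, rand_repr.shape)                    # (128, 2) (128, 2)

Because both encodings depend on the context points only through the differences t - x, shifting the context inputs shifts the representation on the grid by the same amount, which is the translation equivariance the abstract refers to.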

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-jung23a,
  title     = {Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior},
  author    = {Jung, Yohan and Park, Jinkyoo},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {3795--3824},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/jung23a/jung23a.pdf},
  url       = {https://proceedings.mlr.press/v206/jung23a.html},
  abstract  = {Convolutional deep sets is a neural network architecture that can model stationary stochastic processes. This architecture uses the kernel smoother and deep convolutional neural network to construct translation equivariant functional representations. However, the non-parametric nature of the kernel smoother can produce ambiguous representations when the number of data points is not given sufficiently. To address this issue, we introduce bayesian convolutional deep sets, which constructs random translation equivariant functional representations with a stationary prior. Furthermore, we present how to impose the task-dependent prior for each dataset because a wrongly imposed prior can result in an even worse representation than that of the kernel smoother. Empirically, we demonstrate that the proposed architecture alleviates the targeted issue in various experiments with time-series and image datasets.}
}
Endnote
%0 Conference Paper
%T Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior
%A Yohan Jung
%A Jinkyoo Park
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-jung23a
%I PMLR
%P 3795--3824
%U https://proceedings.mlr.press/v206/jung23a.html
%V 206
%X Convolutional deep sets is a neural network architecture that can model stationary stochastic processes. This architecture uses the kernel smoother and deep convolutional neural network to construct translation equivariant functional representations. However, the non-parametric nature of the kernel smoother can produce ambiguous representations when the number of data points is not given sufficiently. To address this issue, we introduce bayesian convolutional deep sets, which constructs random translation equivariant functional representations with a stationary prior. Furthermore, we present how to impose the task-dependent prior for each dataset because a wrongly imposed prior can result in an even worse representation than that of the kernel smoother. Empirically, we demonstrate that the proposed architecture alleviates the targeted issue in various experiments with time-series and image datasets.
APA
Jung, Y. & Park, J. (2023). Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:3795-3824. Available from https://proceedings.mlr.press/v206/jung23a.html.
