Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3795-3824, 2023.
Abstract
Convolutional deep sets is a neural network architecture that can model stationary stochastic processes. It combines a kernel smoother with a deep convolutional neural network to construct translation equivariant functional representations. However, the non-parametric nature of the kernel smoother can produce ambiguous representations when too few data points are given. To address this issue, we introduce Bayesian convolutional deep sets, which constructs random translation equivariant functional representations with a stationary prior. Furthermore, we show how to impose a task-dependent prior for each dataset, because a wrongly imposed prior can yield an even worse representation than that of the kernel smoother. Empirically, we demonstrate that the proposed architecture alleviates the targeted issue in various experiments on time-series and image datasets.
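The translation equivariant functional representation mentioned in the abstract follows the kernel-smoother (set convolution) encoding used in convolutional deep sets. Below is a minimal sketch of that encoding for a 1D context set on a uniform grid, assuming an RBF kernel; the function names and the density normalisation are illustrative choices, not details taken from this paper.

```python
import numpy as np

def rbf_kernel(t_grid, x_ctx, lengthscale=0.1):
    # Pairwise RBF weights between grid locations and context inputs.
    d2 = (t_grid[:, None] - x_ctx[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def functional_representation(t_grid, x_ctx, y_ctx, lengthscale=0.1):
    """Kernel-smoother encoding of a 1D context set onto a uniform grid.

    Returns a (len(t_grid), 2) array: a density channel counting nearby
    context points and a data channel carrying their smoothed values.
    A CNN applied to this grid-valued signal remains translation equivariant.
    """
    w = rbf_kernel(t_grid, x_ctx, lengthscale)   # (T, N) smoothing weights
    density = w.sum(axis=1)                      # density channel
    signal = w @ y_ctx                           # smoothed observations
    # Normalising by the density reduces sensitivity to the number of
    # context points, but with very few points the estimate is still
    # ambiguous -- the issue the paper targets with a stationary prior.
    data = signal / np.maximum(density, 1e-8)
    return np.stack([density, data], axis=-1)

# Example: a sparse context set yields a coarse, ambiguous representation.
t_grid = np.linspace(0.0, 1.0, 128)
x_ctx = np.array([0.2, 0.25, 0.8])
y_ctx = np.sin(2 * np.pi * x_ctx)
rep = functional_representation(t_grid, x_ctx, y_ctx)
print(rep.shape)  # (128, 2)
```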