Nonparametric Bayesian Deep Networks with Local Competition
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4980-4988, 2019.
Abstract
The aim of this work is to enable inference of deep networks that retain high accuracy for the least possible model complexity, with the latter deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition. In this context, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections or locally competing sets of units, as well as the required floating-point precision for storing the network parameters. Specifically, we introduce auxiliary discrete latent variables representing which initial network components are actually needed for modeling the data at hand, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show using benchmark datasets, our approach yields networks with a smaller computational footprint than the state-of-the-art, and with no compromises in predictive accuracy.
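To make the two ingredients of the abstract concrete, below is a minimal NumPy sketch of (i) a local winner-take-all layer, where linear units are arranged in small groups and only the strongest unit in each group propagates its output, and (ii) an IBP-style stick-breaking construction yielding utility probabilities for discrete "component needed" indicators. The function names, the Beta(alpha, 1) parameterization, and the forward sampling of the indicators are illustrative assumptions for exposition, not the paper's API; the paper performs Bayesian (variational) inference over these latent variables rather than simple ancestral sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def lwta_forward(x, W, b, group_size=2):
    """Local winner-take-all layer (sketch): linear units are split into
    groups of `group_size`; within each group only the unit with the
    largest pre-activation passes its linear output, the rest are zeroed."""
    h = x @ W + b                                   # linear pre-activations
    h = h.reshape(x.shape[0], -1, group_size)       # (batch, groups, group_size)
    winners = h.argmax(axis=-1, keepdims=True)      # index of winner per group
    mask = np.zeros_like(h)
    np.put_along_axis(mask, winners, 1.0, axis=-1)  # one-hot competition mask
    return (h * mask).reshape(x.shape[0], -1)

def stick_breaking_utilities(alpha, num_components, rng):
    """IBP-style stick-breaking (assumed parameterization):
    v_k ~ Beta(alpha, 1), pi_k = prod_{j<=k} v_j, so utility
    probabilities decay with k and unused components can be pruned."""
    v = rng.beta(alpha, 1.0, size=num_components)
    return np.cumprod(v)

# Toy usage: one LWTA layer whose groups are gated by Bernoulli(pi_k)
# indicators z_k (sampled here for illustration; inferred in the paper).
x = rng.normal(size=(4, 8))                # batch of 4 inputs, 8 features
W = 0.1 * rng.normal(size=(8, 16))         # 16 units = 8 groups of 2
b = np.zeros(16)
pi = stick_breaking_utilities(alpha=2.0, num_components=8, rng=rng)
z = rng.binomial(1, pi)                    # discrete component-utility draws
out = lwta_forward(x, W, b, group_size=2)
out = out.reshape(4, 8, 2) * z[None, :, None]  # zero out unneeded groups
print(out.reshape(4, 16).shape)            # (4, 16)
```

Note the design point this sketch mirrors: because competition is among linear units, sparsity comes from the winner-take-all mechanism and from the stick-breaking-gated indicators, rather than from a saturating nonlinearity.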