Deep Switch Networks for Generating Discrete Data and Language
Proceedings of Machine Learning Research, PMLR 89:3060-3069, 2019.
Abstract
Multilayer switch networks are proposed as artificial generators of high-dimensional discrete data (e.g., binary vectors, categorical data, natural language, network log files, and discrete-valued time series). Unlike deconvolution networks, which generate continuous-valued data and consist of upsampling filters and reverse pooling layers, multilayer switch networks are composed of adaptive switches which model conditional distributions of discrete random variables. An interpretable, statistical framework is introduced for training these nonlinear networks based on a maximum-likelihood objective function. To learn network parameters, stochastic gradient descent is applied to the objective, and is stable until convergence. This direct optimization does not involve backpropagation over separate encoder and decoder networks, or adversarial training of dueling networks. While training remains tractable for moderately sized networks, Markov chain Monte Carlo (MCMC) approximations of gradients are derived for deep networks which contain latent variables. The statistical framework is evaluated on synthetic data, high-dimensional binary data of handwritten digits, and web-crawled natural language data. Aspects of the model's framework such as interpretability, computational complexity, and generalization ability are discussed.