Adversarial Discrete Sequence Generation without Explicit Neural Networks as Discriminators
Proceedings of Machine Learning Research, PMLR 89:3089-3098, 2019.
Abstract
This paper presents a novel approach to training GANs for discrete sequence generation without resorting to an explicit neural network as the discriminator. We show that when an alternative minimax optimization procedure is performed on the value function, where a closed-form solution for the discriminator exists in the maximization step, the procedure is equivalent to directly optimizing the Jensen-Shannon divergence (JSD) between the generator's distribution and the empirical distribution over the training data, without sampling from the generator. Optimizing the JSD thus becomes computationally tractable for training a generator that produces sequences of discrete data. Extensive experiments on synthetic data and real-world tasks demonstrate significant improvements over existing methods for training GANs that generate discrete sequences.
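To make the abstract's central quantity concrete: in standard GAN theory, the optimal discriminator for the value function has the closed form D*(x) = p_data(x) / (p_data(x) + p_G(x)), and substituting it back yields (up to constants) the JSD between the two distributions. The sketch below, which is illustrative only and not the paper's implementation, computes the JSD between a hypothetical generator distribution and an empirical distribution over a small discrete vocabulary.

```python
import numpy as np

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), where m = (p + q) / 2.

    Both p and q are explicit probability vectors over the same discrete support.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        # Sum a_i * log(a_i / b_i) over the support of a (terms with a_i = 0
        # contribute nothing; b_i > 0 wherever a_i > 0 since b = m >= a / 2).
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical empirical distribution over training data vs. a generator's
# explicit distribution (three-symbol vocabulary, chosen for illustration).
p_data = np.array([0.5, 0.3, 0.2])
p_gen = np.array([0.4, 0.4, 0.2])
print(jensen_shannon_divergence(p_data, p_gen))
```

Because both distributions enter the formula explicitly, no samples need to be drawn from the generator, which is the tractability property the abstract highlights.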