Adversarial Discrete Sequence Generation without Explicit Neural Networks as Discriminators

Zhongliang Li, Tian Xia, Xingyu Lou, Kaihe Xu, Shaojun Wang, Jing Xiao;
Proceedings of Machine Learning Research, PMLR 89:3089-3098, 2019.

Abstract

This paper presents a novel approach to training GANs for discrete sequence generation without resorting to an explicit neural network as the discriminator. We show that when an alternating mini-max optimization procedure is performed on the value function, using the closed-form solution for the discriminator in the maximization step, the procedure is equivalent to directly optimizing the Jensen-Shannon divergence (JSD) between the generator's distribution and the empirical distribution over the training data, without sampling from the generator. Optimizing the JSD thus becomes computationally tractable for training a generator that produces sequences of discrete data. Extensive experiments on synthetic data and real-world tasks demonstrate significant improvements over existing methods for training GANs that generate discrete sequences.
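As an illustration of the quantity the abstract refers to — not the paper's actual training procedure — the following sketch computes the Jensen-Shannon divergence between a hypothetical empirical data distribution and a generator's distribution over a small discrete vocabulary. The distributions and vocabulary here are invented for the example; the point is only that JSD is a cheap, closed-form computation once both distributions are available explicitly.

```python
import numpy as np

def jsd(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions p and q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        # KL divergence with a small epsilon for numerical stability
        return float(np.sum(a * np.log((a + eps) / (b + eps))))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical empirical distribution from training-data counts
counts = np.array([40.0, 30.0, 20.0, 10.0])
p_data = counts / counts.sum()

# Hypothetical generator distribution over the same vocabulary
p_gen = np.array([0.25, 0.25, 0.25, 0.25])

print(jsd(p_data, p_gen))  # non-negative; 0 only when the distributions match
```

JSD is symmetric and bounded above by log 2, which is one reason it is a convenient training objective when, as the paper argues, it can be evaluated without sampling from the generator.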

Related Material