Feature Quantization Improves GAN Training

Yang Zhao, Chunyuan Li, Ping Yu, Jianfeng Gao, Changyou Chen
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:11376-11386, 2020.

Abstract

The instability of GAN training has been a long-standing problem despite remarkable research efforts. We identify that these instability issues stem from the difficulty of performing feature matching with mini-batch statistics, due to a fragile balance between the fixed target distribution and the progressively evolving generated distribution. In this work, we propose feature quantization (FQ) for the discriminator, which embeds both true and fake data samples into a shared discrete space. The quantized values of FQ are constructed as an evolving dictionary that is consistent with the feature statistics of the recent distribution history. Hence, FQ implicitly enables robust feature matching in a compact space. Our method can be easily plugged into existing GAN models, with little computational overhead in training. Extensive experimental results show that the proposed FQ-GAN improves the FID scores of baseline methods by a large margin on a variety of tasks, including three representative GAN models on 10 benchmarks, achieving new state-of-the-art performance.
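To make the idea concrete, below is a minimal sketch (not the authors' released code) of a dictionary-based feature quantization layer for a GAN discriminator, written in PyTorch. It assumes a VQ-VAE-style codebook updated by an exponential moving average so the dictionary tracks recent feature statistics; the class name FeatureQuantizer and hyperparameters such as num_codes, decay, and commitment are illustrative assumptions, not values from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureQuantizer(nn.Module):
    """Quantizes discriminator features into a shared, evolving dictionary (sketch)."""

    def __init__(self, dim, num_codes=128, decay=0.9, commitment=0.25):
        super().__init__()
        self.decay = decay
        self.commitment = commitment
        # Evolving dictionary (codebook); EMA buffers track recent feature statistics.
        self.register_buffer("codebook", torch.randn(num_codes, dim))
        self.register_buffer("cluster_size", torch.zeros(num_codes))
        self.register_buffer("ema_embed", self.codebook.clone())

    def forward(self, h):
        # h: discriminator features of shape (batch, dim), from real or fake samples.
        dist = torch.cdist(h, self.codebook)              # distances to dictionary entries
        idx = dist.argmin(dim=1)                          # nearest-code assignment
        onehot = F.one_hot(idx, self.codebook.size(0)).type(h.dtype)
        h_q = self.codebook[idx]                          # quantized features

        if self.training:
            # EMA update of the dictionary from the current mini-batch, so the
            # codebook stays consistent with the recent distribution history.
            with torch.no_grad():
                self.cluster_size.mul_(self.decay).add_(onehot.sum(0), alpha=1 - self.decay)
                self.ema_embed.mul_(self.decay).add_(onehot.t() @ h, alpha=1 - self.decay)
                n = self.cluster_size.clamp(min=1e-5).unsqueeze(1)
                self.codebook.copy_(self.ema_embed / n)

        # Commitment loss pulls features toward their dictionary entries; the
        # straight-through estimator passes gradients back to the feature extractor.
        loss = self.commitment * F.mse_loss(h, h_q.detach())
        h_q = h + (h_q - h).detach()
        return h_q, loss

In use, such a layer would sit between two discriminator blocks: features from real and fake images pass through the same quantizer, and the commitment loss is added to the discriminator objective, so both distributions are matched against a common, slowly evolving set of centers rather than raw mini-batch statistics.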

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-zhao20d,
  title = {Feature Quantization Improves {GAN} Training},
  author = {Zhao, Yang and Li, Chunyuan and Yu, Ping and Gao, Jianfeng and Chen, Changyou},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages = {11376--11386},
  year = {2020},
  editor = {III, Hal Daumé and Singh, Aarti},
  volume = {119},
  series = {Proceedings of Machine Learning Research},
  month = {13--18 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v119/zhao20d/zhao20d.pdf},
  url = {https://proceedings.mlr.press/v119/zhao20d.html},
  abstract = {The instability of GAN training has been a long-standing problem despite remarkable research efforts. We identify that these instability issues stem from the difficulty of performing feature matching with mini-batch statistics, due to a fragile balance between the fixed target distribution and the progressively evolving generated distribution. In this work, we propose feature quantization (FQ) for the discriminator, which embeds both true and fake data samples into a shared discrete space. The quantized values of FQ are constructed as an evolving dictionary that is consistent with the feature statistics of the recent distribution history. Hence, FQ implicitly enables robust feature matching in a compact space. Our method can be easily plugged into existing GAN models, with little computational overhead in training. Extensive experimental results show that the proposed FQ-GAN improves the FID scores of baseline methods by a large margin on a variety of tasks, including three representative GAN models on 10 benchmarks, achieving new state-of-the-art performance.}
}
Endnote
%0 Conference Paper
%T Feature Quantization Improves GAN Training
%A Yang Zhao
%A Chunyuan Li
%A Ping Yu
%A Jianfeng Gao
%A Changyou Chen
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-zhao20d
%I PMLR
%P 11376--11386
%U https://proceedings.mlr.press/v119/zhao20d.html
%V 119
%X The instability of GAN training has been a long-standing problem despite remarkable research efforts. We identify that these instability issues stem from the difficulty of performing feature matching with mini-batch statistics, due to a fragile balance between the fixed target distribution and the progressively evolving generated distribution. In this work, we propose feature quantization (FQ) for the discriminator, which embeds both true and fake data samples into a shared discrete space. The quantized values of FQ are constructed as an evolving dictionary that is consistent with the feature statistics of the recent distribution history. Hence, FQ implicitly enables robust feature matching in a compact space. Our method can be easily plugged into existing GAN models, with little computational overhead in training. Extensive experimental results show that the proposed FQ-GAN improves the FID scores of baseline methods by a large margin on a variety of tasks, including three representative GAN models on 10 benchmarks, achieving new state-of-the-art performance.
APA
Zhao, Y., Li, C., Yu, P., Gao, J. & Chen, C. (2020). Feature Quantization Improves GAN Training. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:11376-11386. Available from https://proceedings.mlr.press/v119/zhao20d.html.