Learning Texture Manifolds with the Periodic Spatial GAN

Urs Bergmann, Nikolay Jetchev, Roland Vollgraf
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:469-477, 2017.

Abstract

This paper introduces a novel approach to texture synthesis based on generative adversarial networks (GANs) (Goodfellow et al., 2014), which we call the Periodic Spatial GAN (PSGAN). The PSGAN has several novel abilities that surpass the current state of the art in texture synthesis. First, it can learn multiple textures, periodic or non-periodic, from datasets of one or more complex large images. Second, we show that image generation with PSGANs has the properties of a texture manifold: we can smoothly interpolate between samples in the structured noise space and generate novel samples that lie perceptually between the textures of the original dataset. Multiple experiments show that PSGANs can flexibly handle diverse texture and image data sources, and that the method is highly scalable and can generate output images of arbitrarily large size.
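To make the idea concrete, below is a minimal PyTorch sketch of a spatial GAN generator with structured noise. This is a hypothetical re-implementation for illustration, not the authors' code: the periodic noise component of the full PSGAN is omitted for brevity, and all layer sizes and noise dimensions are assumptions. The generator is fully convolutional, so a larger noise grid directly yields a larger output image, and interpolating the global noise dimensions moves along the learned texture manifold.

import torch
import torch.nn as nn

class SpatialGenerator(nn.Module):
    """Fully convolutional generator: maps a (batch, nz, h, w) noise grid
    to a (batch, 3, 16*h, 16*w) image, so larger grids give larger images.
    Layer sizes are illustrative assumptions, not the paper's exact config."""
    def __init__(self, nz=60, ngf=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(nz, ngf * 4, 4, stride=2, padding=1),
            nn.BatchNorm2d(ngf * 4), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 4, ngf * 2, 4, stride=2, padding=1),
            nn.BatchNorm2d(ngf * 2), nn.ReLU(True),
            nn.ConvTranspose2d(ngf * 2, ngf, 4, stride=2, padding=1),
            nn.BatchNorm2d(ngf), nn.ReLU(True),
            nn.ConvTranspose2d(ngf, 3, 4, stride=2, padding=1),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def structured_noise(batch, h, w, n_local=30, n_global=30):
    """Structured noise: i.i.d. local dimensions plus global dimensions
    broadcast over all spatial positions, so one texture identity is
    shared across the whole image."""
    z_local = torch.randn(batch, n_local, h, w)
    z_global = torch.randn(batch, n_global, 1, 1).expand(-1, -1, h, w)
    return torch.cat([z_local, z_global], dim=1)

G = SpatialGenerator(nz=60)
img = G(structured_noise(batch=1, h=8, w=8))
print(img.shape)  # torch.Size([1, 3, 128, 128])

# Texture-manifold interpolation: blending two global codes yields textures
# that lie perceptually between the two endpoints.
zg_a, zg_b = torch.randn(1, 30, 1, 1), torch.randn(1, 30, 1, 1)
for alpha in (0.0, 0.5, 1.0):
    zg = torch.lerp(zg_a, zg_b, alpha).expand(-1, -1, 8, 8)
    zl = torch.randn(1, 30, 8, 8)
    blended = G(torch.cat([zl, zg], dim=1))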

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-bergmann17a,
  title     = {Learning Texture Manifolds with the Periodic Spatial {GAN}},
  author    = {Urs Bergmann and Nikolay Jetchev and Roland Vollgraf},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {469--477},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/bergmann17a/bergmann17a.pdf},
  url       = {https://proceedings.mlr.press/v70/bergmann17a.html}
}
EndNote
%0 Conference Paper
%T Learning Texture Manifolds with the Periodic Spatial GAN
%A Urs Bergmann
%A Nikolay Jetchev
%A Roland Vollgraf
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-bergmann17a
%I PMLR
%P 469--477
%U https://proceedings.mlr.press/v70/bergmann17a.html
%V 70
APA
Bergmann, U., Jetchev, N. & Vollgraf, R. (2017). Learning Texture Manifolds with the Periodic Spatial GAN. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:469-477. Available from https://proceedings.mlr.press/v70/bergmann17a.html.
