SketchEmbedNet: Learning Novel Concepts by Imitating Drawings
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10870-10881, 2021.
Abstract
Sketch drawings capture the salient information of visual concepts. Previous work has shown that neural networks are capable of producing sketches of natural objects drawn from a small number of classes. While earlier approaches focus on generation quality or retrieval, we explore properties of image representations learned by training a model to produce sketches of images. We show that this generative, class-agnostic model produces informative embeddings of images from novel examples, classes, and even novel datasets in a few-shot setting. Additionally, we find that these learned representations exhibit interesting structure and compositionality.