SDM-Net: A simple and effective model for generalized zero-shot learning
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:2103-2113, 2021.
Abstract
Zero-Shot Learning (ZSL) is a classification task in which some classes, referred to as unseen classes, have no training images. Instead, we only have side information about the seen and unseen classes, often in the form of semantic or descriptive attributes. The lack of training images for a set of classes restricts the use of standard classification techniques and losses, including the widespread cross-entropy loss. We introduce a novel Similarity Distribution Matching Network (SDM-Net), a standard fully connected neural network architecture with a non-trainable penultimate layer consisting of class attributes. The output layer of SDM-Net covers both seen and unseen classes. To enable zero-shot learning, during training we regularize the model so that the predicted distribution over unseen classes is close in KL divergence to the distribution of similarities between the correct seen class and all the unseen classes. We evaluate the proposed model on five benchmark zero-shot learning datasets: AwA1, AwA2, aPY, SUN, and CUB. We show that, despite its simplicity, our approach achieves performance competitive with state-of-the-art methods in the Generalized ZSL setting on all of these datasets.
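To make the described architecture and regularizer concrete, the sketch below is a minimal PyTorch rendering of the idea in the abstract, not the authors' reference implementation: the module names, hidden size, temperature `tau`, and weight `lam` are assumptions, and the cross-entropy term on seen classes is an assumed choice of supervised loss. The fixed penultimate projection onto class attributes and the KL term matching the predicted unseen-class distribution to the attribute-similarity distribution follow the abstract's description.

```python
# A minimal sketch of SDM-Net's idea (assumed names and hyperparameters).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SDMNet(nn.Module):
    def __init__(self, feat_dim, attr_matrix, hidden_dim=1024):
        # attr_matrix: (num_seen + num_unseen, attr_dim) class attributes,
        # kept fixed (non-trainable) as the final projection.
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, attr_matrix.shape[1]),
        )
        self.register_buffer("attrs", attr_matrix)  # frozen attribute layer

    def forward(self, x):
        # Logits over all (seen + unseen) classes: dot products between the
        # projected feature and the fixed class-attribute vectors.
        return self.backbone(x) @ self.attrs.t()

def sdm_loss(logits, y_seen, attrs, seen_idx, unseen_idx, lam=1.0, tau=1.0):
    # Supervised term on seen-class logits (assumed: only seen-class
    # images are available at training time).
    ce = F.cross_entropy(logits[:, seen_idx], y_seen)

    # Target: similarity distribution between the correct seen class
    # attribute and all unseen class attributes.
    sim = attrs[seen_idx][y_seen] @ attrs[unseen_idx].t()   # (B, num_unseen)
    target = F.softmax(sim / tau, dim=1)

    # Predicted distribution over unseen classes, matched to the target in KL.
    log_pred = F.log_softmax(logits[:, unseen_idx], dim=1)
    kl = F.kl_div(log_pred, target, reduction="batchmean")
    return ce + lam * kl
```

Under these assumptions, the KL term is what lets the model assign non-trivial probability mass to unseen classes at test time, since their output weights (attribute vectors) are never updated by gradient descent.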