Deep Spectral Clustering Learning
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1985-1994, 2017.
Abstract
Clustering is the task of grouping a set of examples so that similar examples are grouped into the same cluster while dissimilar examples are placed in different clusters. The quality of a clustering depends on two problem-dependent factors: i) the chosen similarity metric and ii) the data representation. Supervised clustering approaches, which exploit labeled partitioned datasets, have thus been proposed, for instance to learn a metric optimized to perform clustering. However, most of these approaches assume that the representation of the data is fixed and then learn only an appropriate linear transformation. Some deep supervised clustering learning approaches have also been proposed; however, they rely on iterative methods to compute gradients, resulting in high algorithmic complexity. In this paper, we propose a deep supervised clustering metric learning method that formulates a novel loss function. We derive a closed-form expression for the gradient that is efficient to compute: the complexity of computing the gradient is linear in the size of the training mini-batch and quadratic in the representation dimensionality. We further reveal how our approach can be seen as learning spectral clustering. Experiments on standard real-world datasets confirm state-of-the-art Recall@K performance.
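The abstract reports Recall@K, a standard retrieval-style metric for learned embeddings: the fraction of examples whose K nearest neighbours (under the learned metric, excluding the example itself) contain at least one example of the same class. The sketch below is an illustrative NumPy implementation of this standard metric under a Euclidean distance assumption, not code from the paper; the function name and brute-force neighbour search are our own choices.

```python
import numpy as np

def recall_at_k(embeddings, labels, k):
    """Recall@K: fraction of examples whose k nearest neighbours
    (excluding the example itself) include at least one example
    with the same label. Brute-force Euclidean version for clarity."""
    # Pairwise squared Euclidean distances via the expansion
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(embeddings ** 2, axis=1)
    dists = sq[:, None] + sq[None, :] - 2.0 * embeddings @ embeddings.T
    np.fill_diagonal(dists, np.inf)  # never count an example as its own neighbour
    # Indices of the k closest neighbours for each example.
    nn = np.argsort(dists, axis=1)[:, :k]
    # Hit if any of the k neighbours shares the query's label.
    hits = np.any(labels[nn] == labels[:, None], axis=1)
    return float(hits.mean())
```

For well-separated clusters whose members share labels, Recall@1 is 1.0; if nearest neighbours systematically carry a different label, it drops toward 0.0. In practice an approximate nearest-neighbour index replaces the brute-force distance matrix for large datasets.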