Kernel Normalized Cut: a Theoretical Revisit
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6206-6214, 2019.
Abstract
In this paper, we study the theoretical properties of clustering based on the kernel normalized cut. Our first contribution is to derive a non-asymptotic upper bound on the expected distortion rate of the kernel normalized cut. From this result, we show that the solution of the kernel normalized cut converges to that of population-level weighted k-means clustering on a certain reproducing kernel Hilbert space (RKHS). Our second contribution is the discovery that population-level weighted k-means clustering in the RKHS is equivalent to the population-level normalized cut. Combining these results, we see that the kernel normalized cut converges to the population-level normalized cut. The criterion of the population-level normalized cut can be regarded as an indivisibility of the population distribution, and this criterion plays an important role in the theoretical analysis of spectral clustering in Schiebinger et al. (2015). We believe that our results provide deep insights into the behavior of both the normalized cut and spectral clustering.
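The finite-sample objective underlying the abstract's equivalence claim is weighted kernel k-means: points are clustered by their squared distance to weighted centroids in the RKHS, computed entirely through the kernel matrix. The sketch below is illustrative, not the paper's algorithm; the function name, the deterministic initialization, and the choice of kernel and weights are all assumptions for the example.

```python
import numpy as np

def weighted_kernel_kmeans(K, w, k, n_iter=100):
    """Weighted kernel k-means (illustrative sketch).

    K : (n, n) PSD kernel matrix, w : (n,) positive sample weights.
    Returns hard cluster labels in {0, ..., k-1}.
    """
    n = K.shape[0]
    labels = np.arange(n) % k          # simple deterministic init (illustrative)
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)  # empty clusters stay at +inf
        for c in range(k):
            idx = labels == c
            if not idx.any():
                continue
            wc = w[idx]
            sc = wc.sum()
            # squared RKHS distance to the weighted centroid m_c:
            #   K_ii - 2 * sum_j w_j K_ij / s_c + sum_{j,l} w_j w_l K_jl / s_c^2
            cross = K[:, idx] @ wc / sc
            inner = wc @ K[np.ix_(idx, idx)] @ wc / sc ** 2
            dist[:, c] = diag - 2.0 * cross + inner
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

With uniform weights this reduces to ordinary kernel k-means; choosing graph-degree weights together with a suitably normalized affinity kernel recovers the normalized-cut objective in the finite-sample setting.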