On the optimality of kernels for high-dimensional clustering

Leena C Vankadara, Debarghya Ghoshdastidar
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2185-2195, 2020.

Abstract

This paper studies the optimality of kernel methods in high-dimensional data clustering. Recent works have studied the large-sample performance of kernel clustering in the high-dimensional regime, where Euclidean distance becomes less informative. However, it is unknown whether popular methods, such as kernel k-means, are optimal in this regime. We consider the problem of high-dimensional Gaussian clustering and show that, with the exponential kernel function, the sufficient conditions for partial recovery of clusters using the NP-hard kernel k-means objective match the known information-theoretic limit up to a factor of $\sqrt{2}$. They also exactly match the known upper bounds for the non-kernel setting. We also show that a semi-definite relaxation of the kernel k-means procedure matches, up to constant factors, the spectral threshold, below which no polynomial-time algorithm is known to succeed. This is the first work that provides such optimality guarantees for kernel k-means as well as its convex relaxation. Our proofs demonstrate the utility of the lesser-known polynomial concentration results for random variables with exponentially decaying tails in the higher-order analysis of kernel methods.
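The setting the abstract describes (kernel k-means on a high-dimensional Gaussian mixture) can be illustrated with a minimal numpy sketch. This is not the paper's algorithm or analysis: the Gaussian kernel $\exp(-\gamma\|x-y\|^2)$ is used as a stand-in for the exponential kernel, and the bandwidth scaling `gamma = 1/d` and the mixture parameters are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two high-dimensional Gaussian clusters (illustrative parameters, not the paper's).
d, n_per = 200, 50
mu = np.zeros(d)
mu[:10] = 2.0  # hypothetical mean separation spread over 10 coordinates
X = np.vstack([rng.normal(0.0, 1.0, (n_per, d)),
               rng.normal(mu, 1.0, (n_per, d))])
y_true = np.array([0] * n_per + [1] * n_per)

def exp_kernel(X, gamma):
    # Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2).
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(D2, 0.0))

def kernel_kmeans(K, k, n_iter=50, seed=0):
    # Lloyd-style updates in feature space using only kernel evaluations:
    # ||phi(x) - mean_c||^2 = K_xx - (2/m) sum_j K_xj + (1/m^2) sum_{j,j'} K_jj'.
    local_rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = local_rng.integers(0, k, n)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            m = mask.sum()
            if m == 0:
                continue  # empty cluster: keep distance at infinity
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / m
                          + K[np.ix_(mask, mask)].sum() / m**2)
        new = dist.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

K = exp_kernel(X, gamma=1.0 / d)  # bandwidth shrinking with dimension (assumption)
labels = kernel_kmeans(K, k=2)

# Partial-recovery proxy: agreement with the planted labels up to label swap.
agree = max((labels == y_true).mean(), (labels != y_true).mean())
```

The squared-distance identity in `kernel_kmeans` is what makes k-means runnable in feature space without ever forming the (here infinite-dimensional) feature map explicitly; only entries of the kernel matrix `K` are touched.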

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-vankadara20a,
  title     = {On the optimality of kernels for high-dimensional clustering},
  author    = {Vankadara, Leena C and Ghoshdastidar, Debarghya},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2185--2195},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/vankadara20a/vankadara20a.pdf},
  url       = {https://proceedings.mlr.press/v108/vankadara20a.html},
  abstract  = {This paper studies the optimality of kernel methods in high dimensional data clustering. Recent works have studied the large sample performance of kernel clustering in the high dimensional regime, where Euclidean distance becomes less informative. However, it is unknown whether popular methods, such as kernel k-means, are optimal in this regime. We consider the problem of high dimensional Gaussian clustering and show that, with the exponential kernel function, the sufficient conditions for partial recovery of clusters using the NP-hard kernel k-means objective matches the known information-theoretic limit up to a factor of $\sqrt{2}$. It also exactly matches the known upper bounds for the non-kernel setting. We also show that a semi-definite relaxation of the kernel k-means procedure matches up to constant factors, the spectral threshold, below which no polynomial-time algorithm is known to succeed. This is the first work that provides such optimality guarantees for the kernel k-means as well as its convex relaxation. Our proofs demonstrate the utility of the less known polynomial concentration results for random variables with exponentially decaying tails in the higher-order analysis of kernel methods.}
}
Endnote
%0 Conference Paper
%T On the optimality of kernels for high-dimensional clustering
%A Leena C Vankadara
%A Debarghya Ghoshdastidar
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-vankadara20a
%I PMLR
%P 2185--2195
%U https://proceedings.mlr.press/v108/vankadara20a.html
%V 108
%X This paper studies the optimality of kernel methods in high dimensional data clustering. Recent works have studied the large sample performance of kernel clustering in the high dimensional regime, where Euclidean distance becomes less informative. However, it is unknown whether popular methods, such as kernel k-means, are optimal in this regime. We consider the problem of high dimensional Gaussian clustering and show that, with the exponential kernel function, the sufficient conditions for partial recovery of clusters using the NP-hard kernel k-means objective matches the known information-theoretic limit up to a factor of $\sqrt{2}$. It also exactly matches the known upper bounds for the non-kernel setting. We also show that a semi-definite relaxation of the kernel k-means procedure matches up to constant factors, the spectral threshold, below which no polynomial-time algorithm is known to succeed. This is the first work that provides such optimality guarantees for the kernel k-means as well as its convex relaxation. Our proofs demonstrate the utility of the less known polynomial concentration results for random variables with exponentially decaying tails in the higher-order analysis of kernel methods.
APA
Vankadara, L.C. & Ghoshdastidar, D. (2020). On the optimality of kernels for high-dimensional clustering. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2185-2195. Available from https://proceedings.mlr.press/v108/vankadara20a.html.