Minimum Conditional Entropy Clustering: A Discriminative Framework for Clustering

Bo Dai, Baogang Hu
Proceedings of 2nd Asian Conference on Machine Learning, PMLR 13:47-62, 2010.

Abstract

In this paper, we introduce an assumption that makes it possible to extend the learning ability of discriminative models to the unsupervised setting. We propose an information-theoretic framework as an implementation of the low-density separation assumption. The proposed framework provides a unified perspective on Maximum Margin Clustering (MMC), Discriminative $k$-means, Spectral Clustering and Unsupervised Rényi's Entropy Analysis, and it also leads to a novel and efficient algorithm, Accelerated Maximum Relative Margin Clustering (ARMC), which maximizes the margin while accounting for the spread of projections and affine invariance. Experimental results show that the proposed discriminative unsupervised learning method uses data more efficiently and achieves state-of-the-art or better performance compared with mainstream clustering methods.
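
The abstract does not reproduce the objective itself; purely as an illustrative sketch (the notation below is assumed for orientation, not taken from the paper), a minimum-conditional-entropy criterion of the kind the title names can be written, for a discriminative model $p(y \mid x; \theta)$ over cluster labels $y \in \{1, \dots, K\}$ and a sample $x_1, \dots, x_N$, as

$$\min_{\theta} \; \hat{H}(Y \mid X) \;=\; -\frac{1}{N} \sum_{i=1}^{N} \sum_{y=1}^{K} p(y \mid x_i; \theta) \, \log p(y \mid x_i; \theta),$$

typically combined with a regularizer or class-balance term to rule out the degenerate solution that assigns every point to a single cluster. Under the low-density separation assumption, driving this conditional entropy down pushes the decision boundary of $p(y \mid x; \theta)$ away from densely populated regions of the input space.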

Cite this Paper


BibTeX
@InProceedings{pmlr-v13-dai10a,
  title     = {Minimum Conditional Entropy Clustering: A Discriminative Framework for Clustering},
  author    = {Dai, Bo and Hu, Baogang},
  booktitle = {Proceedings of 2nd Asian Conference on Machine Learning},
  pages     = {47--62},
  year      = {2010},
  editor    = {Sugiyama, Masashi and Yang, Qiang},
  volume    = {13},
  series    = {Proceedings of Machine Learning Research},
  address   = {Tokyo, Japan},
  month     = {08--10 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v13/dai10a/dai10a.pdf},
  url       = {https://proceedings.mlr.press/v13/dai10a.html},
  abstract  = {In this paper, we introduce an assumption which makes it possible to extend the learning ability of discriminative model to unsupervised setting. We propose an information-theoretic framework as an implementation of the low-density separation assumption. The proposed framework provides a unified perspective of Maximum Margin Clustering (MMC), Discriminative $k$-means, Spectral Clustering and Unsupervised Renyi's Entropy Analysis and also leads to a novel and efficient algorithm, Accelerated Maximum Relative Margin Clustering (ARMC), which maximizes the margin while considering the spread of projections and affine invariance. Experimental results show that the proposed discriminative unsupervised learning method is more efficient in utilizing data and achieves the state-of-the-art or even better performance compared with mainstream clustering methods.}
}
APA
Dai, B. & Hu, B. (2010). Minimum Conditional Entropy Clustering: A Discriminative Framework for Clustering. Proceedings of 2nd Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 13:47-62. Available from https://proceedings.mlr.press/v13/dai10a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v13/dai10a/dai10a.pdf