Beyond Hartigan Consistency: Merge Distortion Metric for Hierarchical Clustering

Justin Eldridge, Mikhail Belkin, Yusu Wang
Proceedings of The 28th Conference on Learning Theory, PMLR 40:588-606, 2015.

Abstract

Hierarchical clustering is a popular method for analyzing data which associates a tree to a dataset. Hartigan consistency has been used extensively as a framework to analyze such clustering algorithms from a statistical point of view. Still, as we show in the paper, a tree which is Hartigan consistent with a given density can look very different from the correct limit tree. Specifically, Hartigan consistency permits two types of undesirable configurations which we term over-segmentation and improper nesting. Moreover, Hartigan consistency is a limit property and does not directly quantify the difference between trees. In this paper we identify two limit properties, separation and minimality, which address both over-segmentation and improper nesting and together imply (but are not implied by) Hartigan consistency. We proceed to introduce a merge distortion metric between hierarchical clusterings and show that convergence in our distance implies both separation and minimality. We also prove that uniform separation and minimality imply convergence in the merge distortion metric. Furthermore, we show that our merge distortion metric is stable under perturbations of the density. Finally, we demonstrate the applicability of these concepts by proving convergence results for two clustering algorithms. First, we show convergence (and hence separation and minimality) of the recent robust single linkage algorithm of Chaudhuri and Dasgupta (2010). Second, we provide convergence results on manifolds for topological split tree clustering.
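To make the key definition concrete: on one natural reading of the abstract, the merge height of two points in a hierarchical clustering is the height of the smallest cluster containing both, and the merge distortion between two clusterings of the same points is the largest difference in merge heights over all pairs. The short Python sketch below illustrates this under that assumption; the tree encoding, function names, and toy heights are illustrative, not taken from the paper.

import itertools

# A cluster tree node is (height, children); a leaf is a bare label.
# Heights play the role of density levels: denser clusters sit higher.
def pairwise_merge_heights(node, out):
    """Return the leaves under `node`, recording in `out` the merge height
    of every pair of leaves that first share a cluster at this node."""
    if not isinstance(node, tuple):          # a leaf
        return [node]
    height, children = node
    groups = [pairwise_merge_heights(child, out) for child in children]
    # Leaves from different child subtrees merge exactly at this node.
    for g1, g2 in itertools.combinations(groups, 2):
        for x in g1:
            for y in g2:
                out[frozenset((x, y))] = height
    return [leaf for group in groups for leaf in group]

def merge_distortion(tree1, tree2):
    """Sup over point pairs of the difference in merge heights."""
    m1, m2 = {}, {}
    pairwise_merge_heights(tree1, m1)
    pairwise_merge_heights(tree2, m2)
    return max(abs(m1[pair] - m2[pair]) for pair in m1)

# Two toy cluster trees over the same four points, e.g. a true density
# cluster tree and an estimate of it (hypothetical numbers).
truth    = (0.0, [(0.6, ['a', 'b']), (0.4, ['c', 'd'])])
estimate = (0.0, [(0.5, ['a', 'b']), (0.3, ['c', 'd'])])
print(merge_distortion(truth, estimate))     # ~0.1

In this toy example the two trees have the same shape and differ only in the heights at which clusters appear, so the distortion reduces to the largest height discrepancy; the metric also penalizes structural disagreements, since points that merge early in one tree and late in the other produce a large merge-height gap.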

Cite this Paper


BibTeX
@InProceedings{pmlr-v40-Eldridge15,
  title     = {Beyond Hartigan Consistency: Merge Distortion Metric for Hierarchical Clustering},
  author    = {Eldridge, Justin and Belkin, Mikhail and Wang, Yusu},
  booktitle = {Proceedings of The 28th Conference on Learning Theory},
  pages     = {588--606},
  year      = {2015},
  editor    = {Grünwald, Peter and Hazan, Elad and Kale, Satyen},
  volume    = {40},
  series    = {Proceedings of Machine Learning Research},
  address   = {Paris, France},
  month     = {03--06 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v40/Eldridge15.pdf},
  url       = {https://proceedings.mlr.press/v40/Eldridge15.html},
  abstract  = {Hierarchical clustering is a popular method for analyzing data which associates a tree to a dataset. Hartigan consistency has been used extensively as a framework to analyze such clustering algorithms from a statistical point of view. Still, as we show in the paper, a tree which is Hartigan consistent with a given density can look very different from the correct limit tree. Specifically, Hartigan consistency permits two types of undesirable configurations which we term \emph{over-segmentation} and \emph{improper nesting}. Moreover, Hartigan consistency is a limit property and does not directly quantify the difference between trees. In this paper we identify two limit properties, \emph{separation} and \emph{minimality}, which address both over-segmentation and improper nesting and together imply (but are not implied by) Hartigan consistency. We proceed to introduce a \emph{merge distortion metric} between hierarchical clusterings and show that convergence in our distance implies both separation and minimality. We also prove that uniform separation and minimality imply convergence in the merge distortion metric. Furthermore, we show that our merge distortion metric is stable under perturbations of the density. Finally, we demonstrate the applicability of these concepts by proving convergence results for two clustering algorithms. First, we show convergence (and hence separation and minimality) of the recent robust single linkage algorithm of Chaudhuri and Dasgupta (2010). Second, we provide convergence results on manifolds for topological split tree clustering.}
}
Endnote
%0 Conference Paper
%T Beyond Hartigan Consistency: Merge Distortion Metric for Hierarchical Clustering
%A Justin Eldridge
%A Mikhail Belkin
%A Yusu Wang
%B Proceedings of The 28th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2015
%E Peter Grünwald
%E Elad Hazan
%E Satyen Kale
%F pmlr-v40-Eldridge15
%I PMLR
%P 588--606
%U https://proceedings.mlr.press/v40/Eldridge15.html
%V 40
%X Hierarchical clustering is a popular method for analyzing data which associates a tree to a dataset. Hartigan consistency has been used extensively as a framework to analyze such clustering algorithms from a statistical point of view. Still, as we show in the paper, a tree which is Hartigan consistent with a given density can look very different from the correct limit tree. Specifically, Hartigan consistency permits two types of undesirable configurations which we term over-segmentation and improper nesting. Moreover, Hartigan consistency is a limit property and does not directly quantify the difference between trees. In this paper we identify two limit properties, separation and minimality, which address both over-segmentation and improper nesting and together imply (but are not implied by) Hartigan consistency. We proceed to introduce a merge distortion metric between hierarchical clusterings and show that convergence in our distance implies both separation and minimality. We also prove that uniform separation and minimality imply convergence in the merge distortion metric. Furthermore, we show that our merge distortion metric is stable under perturbations of the density. Finally, we demonstrate the applicability of these concepts by proving convergence results for two clustering algorithms. First, we show convergence (and hence separation and minimality) of the recent robust single linkage algorithm of Chaudhuri and Dasgupta (2010). Second, we provide convergence results on manifolds for topological split tree clustering.
RIS
TY  - CPAPER
TI  - Beyond Hartigan Consistency: Merge Distortion Metric for Hierarchical Clustering
AU  - Justin Eldridge
AU  - Mikhail Belkin
AU  - Yusu Wang
BT  - Proceedings of The 28th Conference on Learning Theory
DA  - 2015/06/26
ED  - Peter Grünwald
ED  - Elad Hazan
ED  - Satyen Kale
ID  - pmlr-v40-Eldridge15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 40
SP  - 588
EP  - 606
L1  - http://proceedings.mlr.press/v40/Eldridge15.pdf
UR  - https://proceedings.mlr.press/v40/Eldridge15.html
AB  - Hierarchical clustering is a popular method for analyzing data which associates a tree to a dataset. Hartigan consistency has been used extensively as a framework to analyze such clustering algorithms from a statistical point of view. Still, as we show in the paper, a tree which is Hartigan consistent with a given density can look very different from the correct limit tree. Specifically, Hartigan consistency permits two types of undesirable configurations which we term over-segmentation and improper nesting. Moreover, Hartigan consistency is a limit property and does not directly quantify the difference between trees. In this paper we identify two limit properties, separation and minimality, which address both over-segmentation and improper nesting and together imply (but are not implied by) Hartigan consistency. We proceed to introduce a merge distortion metric between hierarchical clusterings and show that convergence in our distance implies both separation and minimality. We also prove that uniform separation and minimality imply convergence in the merge distortion metric. Furthermore, we show that our merge distortion metric is stable under perturbations of the density. Finally, we demonstrate the applicability of these concepts by proving convergence results for two clustering algorithms. First, we show convergence (and hence separation and minimality) of the recent robust single linkage algorithm of Chaudhuri and Dasgupta (2010). Second, we provide convergence results on manifolds for topological split tree clustering.
ER  -
APA
Eldridge, J., Belkin, M. & Wang, Y. (2015). Beyond Hartigan Consistency: Merge Distortion Metric for Hierarchical Clustering. Proceedings of The 28th Conference on Learning Theory, in Proceedings of Machine Learning Research 40:588-606. Available from https://proceedings.mlr.press/v40/Eldridge15.html.