Differentially Private Hierarchical Clustering with Provable Approximation Guarantees
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:14353-14375, 2023.
Abstract
Hierarchical clustering is a popular unsupervised machine learning method with decades of history and numerous applications. We initiate the study of differentially private approximation algorithms for hierarchical clustering under the rigorous cost framework introduced by Dasgupta (2016). We show a strong lower bound for the problem: any ε-DP algorithm must incur Ω(|V|^2/ε) additive error on an input dataset V. We then exhibit a polynomial-time approximation algorithm with O(|V|^{2.5}/ε) additive error, and an exponential-time algorithm that matches the lower bound. To circumvent the lower bound, we focus on the stochastic block model, a popular random graph model, and, under a separation assumption on the blocks, propose a private (1+o(1))-approximation algorithm that also recovers the blocks exactly. Finally, we perform an empirical study of our algorithms and validate their performance.
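For context on the objective referenced above: Dasgupta's (2016) framework scores a hierarchy T over a weighted graph G = (V, E, w) by the cost

cost_G(T) = \sum_{(i,j) \in E} w(i,j) \cdot |\mathrm{leaves}(T[i \vee j])|

where T[i ∨ j] is the subtree rooted at the least common ancestor of leaves i and j, and |leaves(·)| counts its leaves. Lower cost is better, and the additive error terms quoted in the abstract are measured against the optimum of this objective.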
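As a concrete illustration of how edge-weight privacy can be obtained in this setting, the sketch below shows a generic input-perturbation baseline, not the paper's algorithm: add Laplace noise to the pairwise similarities once, then run an off-the-shelf agglomerative method on the noisy matrix. The adjacency notion (neighboring inputs differ by at most 1 in a single edge weight) and the choice of average linkage are assumptions made for illustration; this baseline carries no approximation guarantee for Dasgupta's cost.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

def dp_average_linkage(weights, epsilon, seed=0):
    """Input-perturbation baseline (illustrative, not the paper's method).

    Assumes edge-level adjacency: neighboring inputs differ by at most 1
    in one edge weight, so adding Laplace(1/epsilon) noise to each edge
    makes the released matrix epsilon-DP; everything afterwards is
    post-processing and preserves the guarantee.
    """
    rng = np.random.default_rng(seed)
    n = weights.shape[0]
    noise = np.triu(rng.laplace(scale=1.0 / epsilon, size=(n, n)), 1)
    noisy = np.triu(weights, 1) + noise   # one noise draw per unordered edge
    noisy = noisy + noisy.T               # symmetric noisy similarity matrix
    dist = noisy.max() - noisy            # convert similarities to distances
    condensed = dist[np.triu_indices(n, 1)]  # condensed vector for scipy
    return linkage(condensed, method="average")
```

Because the entire noisy matrix is released in a single step, the privacy budget is spent once, and the whole hierarchy is then obtained purely by post-processing.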