Generalized Reductions: Making any Hierarchical Clustering Fair and Balanced with Low Cost

Marina Knittel, Max Springer, John P Dickerson, Mohammadtaghi Hajiaghayi
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:17218-17242, 2023.

Abstract

Clustering is a fundamental building block of modern statistical analysis pipelines. Fair clustering has received much attention from the machine learning community in recent years. We are among the first to study fairness in the context of hierarchical clustering, following the results of Ahmadian et al. (NeurIPS 2020). We evaluate our results using Dasgupta’s cost function, one of the most prevalent theoretical metrics for hierarchical clustering evaluation. Our work vastly improves the previous $O(n^{5/6}\mathrm{poly}\log(n))$ fair approximation for cost to a near-polylogarithmic $O(n^\delta \mathrm{poly}\log(n))$ fair approximation for any constant $\delta\in(0,1)$. This result establishes a cost-fairness tradeoff and extends to broader fairness constraints than the previous work. We also show how to alter existing hierarchical clusterings to guarantee fairness and cluster balance across any level in the hierarchy.
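
For reference, the objective named above is Dasgupta’s cost function; the standard definition from the hierarchical clustering literature (notation here is assumed, not quoted from the paper) is

$$\mathrm{cost}_G(T) \;=\; \sum_{\{i,j\} \in E} w(i,j)\,\bigl|\mathrm{leaves}\bigl(T[i \vee j]\bigr)\bigr|,$$

where $G=(V,E,w)$ is the input similarity graph, $T$ is a hierarchy whose leaves are the points of $V$, $T[i \vee j]$ is the subtree rooted at the least common ancestor of $i$ and $j$, and $|\mathrm{leaves}(\cdot)|$ counts its leaves. Lower cost means that heavily weighted (more similar) pairs are separated only deep in the tree, inside small clusters.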

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-knittel23a,
  title = {Generalized Reductions: Making any Hierarchical Clustering Fair and Balanced with Low Cost},
  author = {Knittel, Marina and Springer, Max and Dickerson, John P and Hajiaghayi, Mohammadtaghi},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages = {17218--17242},
  year = {2023},
  editor = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume = {202},
  series = {Proceedings of Machine Learning Research},
  month = {23--29 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v202/knittel23a/knittel23a.pdf},
  url = {https://proceedings.mlr.press/v202/knittel23a.html}
}
Endnote
%0 Conference Paper
%T Generalized Reductions: Making any Hierarchical Clustering Fair and Balanced with Low Cost
%A Marina Knittel
%A Max Springer
%A John P Dickerson
%A Mohammadtaghi Hajiaghayi
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-knittel23a
%I PMLR
%P 17218--17242
%U https://proceedings.mlr.press/v202/knittel23a.html
%V 202
APA
Knittel, M., Springer, M., Dickerson, J.P. & Hajiaghayi, M. (2023). Generalized Reductions: Making any Hierarchical Clustering Fair and Balanced with Low Cost. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:17218-17242. Available from https://proceedings.mlr.press/v202/knittel23a.html.

Related Material

Download PDF: https://proceedings.mlr.press/v202/knittel23a/knittel23a.pdf