Fairness Risks for Group-Conditionally Missing Demographics

Kaiqi Jiang, Wenzhe Fan, Mao Li, Xinhua Zhang
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3520-3528, 2025.

Abstract

Fairness-aware classification models have gained increasing attention in recent years as concerns grow on discrimination against some demographic groups. Most existing models require full knowledge of the sensitive features, which can be impractical due to privacy, legal issues, and an individual’s fear of discrimination. The key challenge we will address is the group dependency of the unavailability, e.g., people of some age range may be more reluctant to reveal their age. Our solution augments general fairness risks with probabilistic imputations of the sensitive features, while jointly learning the group-conditionally missing probabilities in a variational auto-encoder. Our model is demonstrated effective on both image and tabular datasets, achieving an improved balance between accuracy and fairness.

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-jiang25b,
  title     = {Fairness Risks for Group-Conditionally Missing Demographics},
  author    = {Jiang, Kaiqi and Fan, Wenzhe and Li, Mao and Zhang, Xinhua},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3520--3528},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/jiang25b/jiang25b.pdf},
  url       = {https://proceedings.mlr.press/v258/jiang25b.html},
  abstract  = {Fairness-aware classification models have gained increasing attention in recent years as concerns grow on discrimination against some demographic groups. Most existing models require full knowledge of the sensitive features, which can be impractical due to privacy, legal issues, and an individual’s fear of discrimination. The key challenge we will address is the group dependency of the unavailability, e.g., people of some age range may be more reluctant to reveal their age. Our solution augments general fairness risks with probabilistic imputations of the sensitive features, while jointly learning the group-conditionally missing probabilities in a variational auto-encoder. Our model is demonstrated effective on both image and tabular datasets, achieving an improved balance between accuracy and fairness.}
}
Endnote
%0 Conference Paper
%T Fairness Risks for Group-Conditionally Missing Demographics
%A Kaiqi Jiang
%A Wenzhe Fan
%A Mao Li
%A Xinhua Zhang
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-jiang25b
%I PMLR
%P 3520--3528
%U https://proceedings.mlr.press/v258/jiang25b.html
%V 258
%X Fairness-aware classification models have gained increasing attention in recent years as concerns grow on discrimination against some demographic groups. Most existing models require full knowledge of the sensitive features, which can be impractical due to privacy, legal issues, and an individual’s fear of discrimination. The key challenge we will address is the group dependency of the unavailability, e.g., people of some age range may be more reluctant to reveal their age. Our solution augments general fairness risks with probabilistic imputations of the sensitive features, while jointly learning the group-conditionally missing probabilities in a variational auto-encoder. Our model is demonstrated effective on both image and tabular datasets, achieving an improved balance between accuracy and fairness.
APA
Jiang, K., Fan, W., Li, M., & Zhang, X. (2025). Fairness Risks for Group-Conditionally Missing Demographics. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3520-3528. Available from https://proceedings.mlr.press/v258/jiang25b.html.