Algorithms Trained on Normal Chest X-rays Can Predict Health Insurance Types

Chi-Yu Chen, Rawan Abulibdeh, Arash Asgari, Sebastián Andrés Cajas Ordóñez, Leo Anthony Celi, Deirdre Goode, Hassan Hamidi, Ned McCague, Laleh Seyyed-Kalantari, Thomas Sounack, Po-Chih Kuo
Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, PMLR 315:4166-4181, 2026.

Abstract

Artificial intelligence is revealing what medicine never intended to encode. Deep vision models, trained on chest X-rays, can now detect not only disease but also invisible traces of social inequality. In this study, we show that state-of-the-art architectures (DenseNet121, SwinV2-T, MedMamba) can predict a patient’s health insurance type, a strong proxy for socioeconomic status, from normal chest X-rays with significant accuracy (AUC $\approx$ 0.70 on MIMIC-CXR-JPG, 0.68 on CheXpert). A complementary machine learning analysis that combined age, race, and sex labels to predict insurance type indicates that demographic features are unlikely to account for the signal. The signal also remains detectable when the model is trained exclusively on a single racial group. Patch-based occlusion reveals that the signal is diffuse rather than localized, embedded in the upper and mid-thoracic regions. This suggests that deep networks may be internalizing subtle traces of clinical environments, equipment differences, or care pathways, in effect learning the socioeconomic signal itself. These findings challenge the assumption that medical images are neutral biological data. By uncovering how models perceive and exploit these hidden social signatures, this work reframes fairness in medical AI: the goal is no longer only to balance datasets or adjust thresholds, but to interrogate and disentangle the social fingerprints embedded in clinical data itself.
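For readers unfamiliar with the occlusion analysis the abstract mentions, the idea can be sketched as follows: mask one square patch of the image at a time, re-run the classifier, and record how much the predicted probability drops. This is a minimal, hypothetical illustration of the general technique, not the authors' code; `occlusion_map`, the patch size, and the stand-in model are all assumptions for demonstration.

```python
import numpy as np

def occlusion_map(predict, image, patch=32, fill=0.0):
    """Coarse importance map via patch occlusion.

    predict: callable mapping an HxW float array to a scalar probability.
    Returns a (H//patch, W//patch) grid of probability drops; larger values
    mark patches the prediction depends on.
    """
    h, w = image.shape
    baseline = predict(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = fill  # blank out one patch
            heat[i // patch, j // patch] = baseline - predict(occluded)
    return heat

# Toy usage with a stand-in "model" that just averages pixel intensity;
# a real analysis would pass the trained classifier's probability function.
img = np.random.rand(224, 224)
heat = occlusion_map(lambda x: float(x.mean()), img)
print(heat.shape)  # (7, 7)
```

A diffuse signal, as reported in the paper, would show up here as many patches with small, comparable drops rather than a few patches with large ones.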

Cite this Paper


BibTeX
@InProceedings{pmlr-v315-chen26c,
  title = {Algorithms Trained on Normal Chest X-rays Can Predict Health Insurance Types},
  author = {Chen, Chi-Yu and Abulibdeh, Rawan and Asgari, Arash and Ord\'o\~nez, Sebasti\'an Andr\'es Cajas and Celi, Leo Anthony and Goode, Deirdre and Hamidi, Hassan and McCague, Ned and Seyyed-Kalantari, Laleh and Sounack, Thomas and Kuo, Po-Chih},
  booktitle = {Proceedings of The 9th International Conference on Medical Imaging with Deep Learning},
  pages = {4166--4181},
  year = {2026},
  editor = {Huo, Yuankai and Gao, Mingchen and Kuo, Chang-Fu and Jin, Yueming and Deng, Ruining},
  volume = {315},
  series = {Proceedings of Machine Learning Research},
  month = {08--10 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v315/main/assets/chen26c/chen26c.pdf},
  url = {https://proceedings.mlr.press/v315/chen26c.html},
  abstract = {Artificial intelligence is revealing what medicine never intended to encode. Deep vision models, trained on chest X-rays, can now detect not only disease but also invisible traces of social inequality. In this study, we show that state-of-the-art architectures (DenseNet121, SwinV2-T, MedMamba) can predict a patient’s health insurance type, a strong proxy for socioeconomic status, from normal chest X-rays with significant accuracy (AUC $\approx$ 0.70 on MIMIC-CXR-JPG, 0.68 on CheXpert). A complementary machine learning analysis that combined age, race, and sex labels to predict insurance type indicates that demographic features are unlikely to account for the signal. The signal also remains detectable when the model is trained exclusively on a single racial group. Patch-based occlusion reveals that the signal is diffuse rather than localized, embedded in the upper and mid-thoracic regions. This suggests that deep networks may be internalizing subtle traces of clinical environments, equipment differences, or care pathways, in effect learning the socioeconomic signal itself. These findings challenge the assumption that medical images are neutral biological data. By uncovering how models perceive and exploit these hidden social signatures, this work reframes fairness in medical AI: the goal is no longer only to balance datasets or adjust thresholds, but to interrogate and disentangle the social fingerprints embedded in clinical data itself.}
}
Endnote
%0 Conference Paper
%T Algorithms Trained on Normal Chest X-rays Can Predict Health Insurance Types
%A Chi-Yu Chen
%A Rawan Abulibdeh
%A Arash Asgari
%A Sebastián Andrés Cajas Ordóñez
%A Leo Anthony Celi
%A Deirdre Goode
%A Hassan Hamidi
%A Ned McCague
%A Laleh Seyyed-Kalantari
%A Thomas Sounack
%A Po-Chih Kuo
%B Proceedings of The 9th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2026
%E Yuankai Huo
%E Mingchen Gao
%E Chang-Fu Kuo
%E Yueming Jin
%E Ruining Deng
%F pmlr-v315-chen26c
%I PMLR
%P 4166--4181
%U https://proceedings.mlr.press/v315/chen26c.html
%V 315
%X Artificial intelligence is revealing what medicine never intended to encode. Deep vision models, trained on chest X-rays, can now detect not only disease but also invisible traces of social inequality. In this study, we show that state-of-the-art architectures (DenseNet121, SwinV2-T, MedMamba) can predict a patient’s health insurance type, a strong proxy for socioeconomic status, from normal chest X-rays with significant accuracy (AUC $\approx$ 0.70 on MIMIC-CXR-JPG, 0.68 on CheXpert). A complementary machine learning analysis that combined age, race, and sex labels to predict insurance type indicates that demographic features are unlikely to account for the signal. The signal also remains detectable when the model is trained exclusively on a single racial group. Patch-based occlusion reveals that the signal is diffuse rather than localized, embedded in the upper and mid-thoracic regions. This suggests that deep networks may be internalizing subtle traces of clinical environments, equipment differences, or care pathways, in effect learning the socioeconomic signal itself. These findings challenge the assumption that medical images are neutral biological data. By uncovering how models perceive and exploit these hidden social signatures, this work reframes fairness in medical AI: the goal is no longer only to balance datasets or adjust thresholds, but to interrogate and disentangle the social fingerprints embedded in clinical data itself.
APA
Chen, C., Abulibdeh, R., Asgari, A., Ordóñez, S.A.C., Celi, L.A., Goode, D., Hamidi, H., McCague, N., Seyyed-Kalantari, L., Sounack, T. & Kuo, P. (2026). Algorithms Trained on Normal Chest X-rays Can Predict Health Insurance Types. Proceedings of The 9th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 315:4166-4181. Available from https://proceedings.mlr.press/v315/chen26c.html.