Skin Malignancy Classification Using Patients’ Skin Images and Meta-data: Multimodal Fusion for Improving Fairness

Ke Wang, Ningyuan Shan, Henry Gouk, Iris Szu-Szu Ho
Proceedings of The 7th International Conference on Medical Imaging with Deep Learning, PMLR 250:1670-1686, 2024.

Abstract

Skin cancer image classification across skin tones is challenging because skin cancer can present differently on different skin tones. This study evaluates the performance of image-only models and fusion models in skin malignancy classification. The fusion models we consider can take in additional patient data, such as an indicator of skin tone, and merge this information with the features produced by the image-only model. Our experiments show that fusion models perform substantially better than image-only models. In particular, we find that a form of multiplicative fusion yields the best-performing models. This finding suggests that skin tone adds predictive value in skin malignancy prediction problems. We further demonstrate that feature fusion methods reduce, but do not entirely eliminate, the disparity in the model's performance across patients with different skin tones.
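
The following is a minimal sketch, not the authors' released code, of the multiplicative fusion idea described in the abstract: an image backbone produces a feature vector, patient meta-data (e.g. a skin-tone indicator) is projected into the same dimensionality, and the two are combined by element-wise multiplication before a classification head. It assumes a PyTorch implementation with a ResNet-18 backbone; all layer choices, dimensions, and meta-data features are illustrative.

import torch
import torch.nn as nn
from torchvision import models


class MultiplicativeFusionClassifier(nn.Module):
    """Illustrative image + meta-data classifier with multiplicative fusion."""

    def __init__(self, num_meta_features: int, num_classes: int = 2):
        super().__init__()
        # Image branch: a standard CNN backbone with its final classifier removed.
        backbone = models.resnet18(weights=None)
        feat_dim = backbone.fc.in_features  # 512 for ResNet-18
        backbone.fc = nn.Identity()
        self.image_encoder = backbone
        # Meta-data branch: project tabular inputs (skin tone, age, site, ...)
        # into the same dimensionality as the image features.
        self.meta_encoder = nn.Sequential(
            nn.Linear(num_meta_features, feat_dim),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, image: torch.Tensor, meta: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_encoder(image)   # (batch, feat_dim)
        meta_feat = self.meta_encoder(meta)    # (batch, feat_dim)
        fused = img_feat * meta_feat           # element-wise (multiplicative) fusion
        return self.classifier(fused)


if __name__ == "__main__":
    # Dummy usage: a batch of 4 RGB images and 4 rows of 8 meta-data features.
    model = MultiplicativeFusionClassifier(num_meta_features=8, num_classes=2)
    logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 8))
    print(logits.shape)  # torch.Size([4, 2])

Additive or concatenation-based fusion can be obtained from the same skeleton by replacing the element-wise product with a sum or a concatenation followed by a linear layer.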

Cite this Paper


BibTeX
@InProceedings{pmlr-v250-wang24c,
  title     = {Skin Malignancy Classification Using Patients’ Skin Images and Meta-data: Multimodal Fusion for Improving Fairness},
  author    = {Wang, Ke and Shan, Ningyuan and Gouk, Henry and Ho, Iris Szu-Szu},
  booktitle = {Proceedings of The 7th International Conference on Medical Imaging with Deep Learning},
  pages     = {1670--1686},
  year      = {2024},
  editor    = {Burgos, Ninon and Petitjean, Caroline and Vakalopoulou, Maria and Christodoulidis, Stergios and Coupe, Pierrick and Delingette, Hervé and Lartizien, Carole and Mateus, Diana},
  volume    = {250},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v250/main/assets/wang24c/wang24c.pdf},
  url       = {https://proceedings.mlr.press/v250/wang24c.html},
  abstract  = {Skin cancer image classification across skin tones is a challenging problem due to the fact that skin cancer can present differently on different skin tones. This study evaluates the performance of image only models and fusion models in skin malignancy classification. The fusion models we consider are able to take in additional patient data, such as an indicator of their skin tone, and merge this information with the features provided by the image-only model. Results from the experiment show that fusion models perform substantially better than image-only models. In particular, we find that a form of multiplicative fusion results in the best performing models. This finding suggests that skin tones add predictive value in skin malignancy prediction problems. We further demonstrate that feature fusion methods reduce, but do not entirely eliminate, the disparity in performance of the model on patients with different skin tones.}
}
Endnote
%0 Conference Paper
%T Skin Malignancy Classification Using Patients’ Skin Images and Meta-data: Multimodal Fusion for Improving Fairness
%A Ke Wang
%A Ningyuan Shan
%A Henry Gouk
%A Iris Szu-Szu Ho
%B Proceedings of The 7th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ninon Burgos
%E Caroline Petitjean
%E Maria Vakalopoulou
%E Stergios Christodoulidis
%E Pierrick Coupe
%E Hervé Delingette
%E Carole Lartizien
%E Diana Mateus
%F pmlr-v250-wang24c
%I PMLR
%P 1670--1686
%U https://proceedings.mlr.press/v250/wang24c.html
%V 250
%X Skin cancer image classification across skin tones is a challenging problem due to the fact that skin cancer can present differently on different skin tones. This study evaluates the performance of image only models and fusion models in skin malignancy classification. The fusion models we consider are able to take in additional patient data, such as an indicator of their skin tone, and merge this information with the features provided by the image-only model. Results from the experiment show that fusion models perform substantially better than image-only models. In particular, we find that a form of multiplicative fusion results in the best performing models. This finding suggests that skin tones add predictive value in skin malignancy prediction problems. We further demonstrate that feature fusion methods reduce, but do not entirely eliminate, the disparity in performance of the model on patients with different skin tones.
APA
Wang, K., Shan, N., Gouk, H. & Ho, I.S. (2024). Skin Malignancy Classification Using Patients’ Skin Images and Meta-data: Multimodal Fusion for Improving Fairness. Proceedings of The 7th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 250:1670-1686. Available from https://proceedings.mlr.press/v250/wang24c.html.
