Automatic Segmentation of Head and Neck Tumors and Nodal Metastases in PET-CT scans

Vincent Andrearczyk, Valentin Oreiller, Martin Vallières, Joel Castelli, Hesham Elhalawani, Mario Jreige, Sarah Boughdad, John O. Prior, Adrien Depeursinge
Proceedings of the Third Conference on Medical Imaging with Deep Learning, PMLR 121:33-43, 2020.

Abstract

Radiomics, the prediction of disease characteristics using quantitative image biomarkers from medical images, relies on expensive manual annotations of Regions of Interest (ROI) to focus the analysis. In this paper, we propose an automatic segmentation of Head and Neck (H&N) tumors and nodal metastases from FDG-PET and CT images. A fully-convolutional network (2D and 3D V-Net) is trained on PET-CT images using ground truth ROIs that were manually delineated by radiation oncologists for 202 patients. The results show the complementarity of the two modalities with a statistically significant improvement from 48.7% and 58.2% Dice Score Coefficients (DSC) with CT- and PET-only segmentation respectively, to 60.6% with a bimodal late fusion approach. We also note that, on this task, a 2D implementation slightly outperforms a similar 3D design (60.6% vs 59.7% for the best results respectively).
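
The abstract reports Dice Score Coefficients (DSC) for CT-only, PET-only, and bimodal late-fusion segmentations. As an illustration only, the sketch below computes a DSC and fuses two single-modality probability maps by per-voxel averaging, one common late-fusion choice; the abstract does not state the paper's exact fusion rule, so the averaging step, threshold, and array shapes here are assumptions, not the authors' implementation.

    # Hedged sketch: DSC evaluation and a simple late fusion of CT-only and
    # PET-only per-voxel tumor probabilities. Averaging + thresholding is an
    # assumed fusion rule for illustration; the paper's scheme may differ.
    import numpy as np

    def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
        """DSC = 2|P ∩ G| / (|P| + |G|) for binary masks."""
        pred = pred.astype(bool)
        target = target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        return float(2.0 * intersection / (pred.sum() + target.sum() + eps))

    def late_fusion(prob_ct: np.ndarray, prob_pet: np.ndarray, threshold: float = 0.5) -> np.ndarray:
        """Average the two networks' foreground probabilities per voxel,
        then threshold to a binary mask (illustrative fusion rule)."""
        fused = 0.5 * (prob_ct + prob_pet)
        return fused >= threshold

    if __name__ == "__main__":
        # Toy volumes with placeholder shapes, not the paper's data.
        rng = np.random.default_rng(0)
        prob_ct = rng.random((144, 144, 48))
        prob_pet = rng.random((144, 144, 48))
        ground_truth = rng.random((144, 144, 48)) > 0.5
        mask = late_fusion(prob_ct, prob_pet)
        print(f"DSC of fused mask vs. toy ground truth: {dice_score(mask, ground_truth):.3f}")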

Cite this Paper


BibTeX
@InProceedings{pmlr-v121-andrearczyk20a,
  title     = {Automatic Segmentation of Head and Neck Tumors and Nodal Metastases in PET-CT scans},
  author    = {Andrearczyk, Vincent and Oreiller, Valentin and Valli\`eres, Martin and Castelli, Joel and Elhalawani, Hesham and Jreige, Mario and Boughdad, Sarah and Prior, John O. and Depeursinge, Adrien},
  booktitle = {Proceedings of the Third Conference on Medical Imaging with Deep Learning},
  pages     = {33--43},
  year      = {2020},
  editor    = {Arbel, Tal and Ben Ayed, Ismail and de Bruijne, Marleen and Descoteaux, Maxime and Lombaert, Herve and Pal, Christopher},
  volume    = {121},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v121/andrearczyk20a/andrearczyk20a.pdf},
  url       = {https://proceedings.mlr.press/v121/andrearczyk20a.html},
  abstract  = {Radiomics, the prediction of disease characteristics using quantitative image biomarkers from medical images, relies on expensive manual annotations of Regions of Interest (ROI) to focus the analysis. In this paper, we propose an automatic segmentation of Head and Neck (H\&N) tumors and nodal metastases from FDG-PET and CT images. A fully-convolutional network (2D and 3D V-Net) is trained on PET-CT images using ground truth ROIs that were manually delineated by radiation oncologists for 202 patients. The results show the complementarity of the two modalities with a statistically significant improvement from 48.7\% and 58.2\% Dice Score Coefficients (DSC) with CT- and PET-only segmentation respectively, to 60.6\% with a bimodal late fusion approach. We also note that, on this task, a 2D implementation slightly outperforms a similar 3D design (60.6\% vs 59.7\% for the best results respectively).}
}
Endnote
%0 Conference Paper
%T Automatic Segmentation of Head and Neck Tumors and Nodal Metastases in PET-CT scans
%A Vincent Andrearczyk
%A Valentin Oreiller
%A Martin Vallières
%A Joel Castelli
%A Hesham Elhalawani
%A Mario Jreige
%A Sarah Boughdad
%A John O. Prior
%A Adrien Depeursinge
%B Proceedings of the Third Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Tal Arbel
%E Ismail Ben Ayed
%E Marleen de Bruijne
%E Maxime Descoteaux
%E Herve Lombaert
%E Christopher Pal
%F pmlr-v121-andrearczyk20a
%I PMLR
%P 33--43
%U https://proceedings.mlr.press/v121/andrearczyk20a.html
%V 121
%X Radiomics, the prediction of disease characteristics using quantitative image biomarkers from medical images, relies on expensive manual annotations of Regions of Interest (ROI) to focus the analysis. In this paper, we propose an automatic segmentation of Head and Neck (H&N) tumors and nodal metastases from FDG-PET and CT images. A fully-convolutional network (2D and 3D V-Net) is trained on PET-CT images using ground truth ROIs that were manually delineated by radiation oncologists for 202 patients. The results show the complementarity of the two modalities with a statistically significant improvement from 48.7% and 58.2% Dice Score Coefficients (DSC) with CT- and PET-only segmentation respectively, to 60.6% with a bimodal late fusion approach. We also note that, on this task, a 2D implementation slightly outperforms a similar 3D design (60.6% vs 59.7% for the best results respectively).
APA
Andrearczyk, V., Oreiller, V., Vallières, M., Castelli, J., Elhalawani, H., Jreige, M., Boughdad, S., Prior, J.O. & Depeursinge, A. (2020). Automatic Segmentation of Head and Neck Tumors and Nodal Metastases in PET-CT scans. Proceedings of the Third Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 121:33-43. Available from https://proceedings.mlr.press/v121/andrearczyk20a.html.