MRI k-Space Motion Artefact Augmentation: Model Robustness and Task-Specific Uncertainty

Richard Shaw, Carole Sudre, Sebastien Ourselin, M. Jorge Cardoso
Proceedings of The 2nd International Conference on Medical Imaging with Deep Learning, PMLR 102:427-436, 2019.

Abstract

Patient movement during the acquisition of magnetic resonance images (MRI) can cause unwanted image artefacts. These artefacts may affect the quality of diagnosis by clinicians and cause errors in automated image analysis. In this work, we present a method for generating realistic motion artefacts from artefact-free data to be used in deep learning frameworks to increase training appearance variability and ultimately make machine learning algorithms such as convolutional neural networks (CNNs) robust to the presence of motion artefacts. We model patient movement as a sequence of randomly-generated, ‘de-meaned’, rigid 3D affine transforms which, by resampling artefact-free volumes, are then combined in k-space to generate realistic motion artefacts. We show that by augmenting the training of semantic segmentation CNNs with artefacted data, we can train models that generalise better and perform more reliably in the presence of artefacted data, with negligible cost to their performance on artefact-free data. We show that the performance of models trained using artefacted data on segmentation tasks on real-world test-retest image pairs is more robust. Finally, we demonstrate that measures of uncertainty obtained from motion augmented models reflect the presence of artefacts and can thus provide relevant information to ensure the safe usage of deep learning extracted biomarkers in clinics.
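The abstract does not include an implementation, but the corruption process it describes (randomly generated, 'de-meaned' rigid transforms whose resampled copies are combined in k-space) can be sketched in a few lines. Below is a minimal illustrative sketch in Python/NumPy, not the authors' code: the function names, motion ranges, number of transforms, and the choice of splitting the first (phase-encoding) axis into contiguous k-space bands are all assumptions made for illustration.

    # Minimal sketch (not the authors' code): simulate k-space motion artefacts
    # by resampling an artefact-free volume under several random rigid
    # transforms, Fourier-transforming each resampled copy, and filling
    # disjoint bands of k-space from different copies before inverting.
    import numpy as np
    from scipy.ndimage import affine_transform

    def random_rigid_transforms(n, max_rot_deg=3.0, max_trans_vox=2.0, rng=None):
        """Draw n small random rigid motions (Euler angles + translations)."""
        rng = np.random.default_rng(rng)
        rots = rng.uniform(-max_rot_deg, max_rot_deg, size=(n, 3))
        trans = rng.uniform(-max_trans_vox, max_trans_vox, size=(n, 3))
        # 'De-mean' so the average motion is zero and the corrupted output
        # stays roughly aligned with the original volume (as in the abstract).
        rots -= rots.mean(axis=0)
        trans -= trans.mean(axis=0)
        return rots, trans

    def euler_to_matrix(angles_deg):
        """3x3 rotation matrix from Euler angles in degrees (ZYX order)."""
        a, b, c = np.deg2rad(angles_deg)
        Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
        Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
        Rz = np.array([[np.cos(c), -np.sin(c), 0], [np.sin(c), np.cos(c), 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def add_motion_artefacts(volume, n_transforms=4, rng=None):
        """Return a motion-corrupted copy of a clean 3D volume."""
        rots, trans = random_rigid_transforms(n_transforms, rng=rng)
        centre = (np.array(volume.shape) - 1) / 2.0
        k_out = np.zeros(volume.shape, dtype=complex)
        # Split the phase-encoding axis (assumed here to be axis 0) into
        # contiguous bands; each band of k-space comes from a different
        # rigidly-moved copy of the volume.
        bands = np.array_split(np.arange(volume.shape[0]), n_transforms)
        for band, rot, t in zip(bands, rots, trans):
            R = euler_to_matrix(rot)
            offset = centre - R @ centre - t   # rotate about the volume centre
            moved = affine_transform(volume, R, offset=offset, order=1)
            k_moved = np.fft.fftshift(np.fft.fftn(moved))
            k_out[band] = k_moved[band]
        return np.abs(np.fft.ifftn(np.fft.ifftshift(k_out)))

    # Example: corrupt a dummy 'volume' as an on-the-fly augmentation step.
    clean = np.random.rand(64, 64, 64).astype(np.float32)
    artefacted = add_motion_artefacts(clean, n_transforms=4, rng=0)

In a training pipeline, such a function would be applied on the fly to a fraction of the training volumes, so that the segmentation CNN sees both clean and artefacted appearances of the same anatomy, which is the augmentation strategy the abstract describes.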

Cite this Paper


BibTeX
@InProceedings{pmlr-v102-shaw19a,
  title     = {MRI k-Space Motion Artefact Augmentation: Model Robustness and Task-Specific Uncertainty},
  author    = {Shaw, Richard and Sudre, Carole and Ourselin, Sebastien and Cardoso, M. Jorge},
  booktitle = {Proceedings of The 2nd International Conference on Medical Imaging with Deep Learning},
  pages     = {427--436},
  year      = {2019},
  editor    = {Cardoso, M. Jorge and Feragen, Aasa and Glocker, Ben and Konukoglu, Ender and Oguz, Ipek and Unal, Gozde and Vercauteren, Tom},
  volume    = {102},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v102/shaw19a/shaw19a.pdf},
  url       = {https://proceedings.mlr.press/v102/shaw19a.html},
  abstract  = {Patient movement during the acquisition of magnetic resonance images (MRI) can cause unwanted image artefacts. These artefacts may affect the quality of diagnosis by clinicians and cause errors in automated image analysis. In this work, we present a method for generating realistic motion artefacts from artefact-free data to be used in deep learning frameworks to increase training appearance variability and ultimately make machine learning algorithms such as convolutional neural networks (CNNs) robust to the presence of motion artefacts. We model patient movement as a sequence of randomly-generated, ‘de-meaned’, rigid 3D affine transforms which, by resampling artefact-free volumes, are then combined in k-space to generate realistic motion artefacts. We show that by augmenting the training of semantic segmentation CNNs with artefacted data, we can train models that generalise better and perform more reliably in the presence of artefacted data, with negligible cost to their performance on artefact-free data. We show that the performance of models trained using artefacted data on segmentation tasks on real-world test-retest image pairs is more robust. Finally, we demonstrate that measures of uncertainty obtained from motion augmented models reflect the presence of artefacts and can thus provide relevant information to ensure the safe usage of deep learning extracted biomarkers in clinics.}
}
Endnote
%0 Conference Paper
%T MRI k-Space Motion Artefact Augmentation: Model Robustness and Task-Specific Uncertainty
%A Richard Shaw
%A Carole Sudre
%A Sebastien Ourselin
%A M. Jorge Cardoso
%B Proceedings of The 2nd International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2019
%E M. Jorge Cardoso
%E Aasa Feragen
%E Ben Glocker
%E Ender Konukoglu
%E Ipek Oguz
%E Gozde Unal
%E Tom Vercauteren
%F pmlr-v102-shaw19a
%I PMLR
%P 427--436
%U https://proceedings.mlr.press/v102/shaw19a.html
%V 102
%X Patient movement during the acquisition of magnetic resonance images (MRI) can cause unwanted image artefacts. These artefacts may affect the quality of diagnosis by clinicians and cause errors in automated image analysis. In this work, we present a method for generating realistic motion artefacts from artefact-free data to be used in deep learning frameworks to increase training appearance variability and ultimately make machine learning algorithms such as convolutional neural networks (CNNs) robust to the presence of motion artefacts. We model patient movement as a sequence of randomly-generated, ‘de-meaned’, rigid 3D affine transforms which, by resampling artefact-free volumes, are then combined in k-space to generate realistic motion artefacts. We show that by augmenting the training of semantic segmentation CNNs with artefacted data, we can train models that generalise better and perform more reliably in the presence of artefacted data, with negligible cost to their performance on artefact-free data. We show that the performance of models trained using artefacted data on segmentation tasks on real-world test-retest image pairs is more robust. Finally, we demonstrate that measures of uncertainty obtained from motion augmented models reflect the presence of artefacts and can thus provide relevant information to ensure the safe usage of deep learning extracted biomarkers in clinics.
APA
Shaw, R., Sudre, C., Ourselin, S., & Cardoso, M. J. (2019). MRI k-Space Motion Artefact Augmentation: Model Robustness and Task-Specific Uncertainty. Proceedings of The 2nd International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 102:427-436. Available from https://proceedings.mlr.press/v102/shaw19a.html.
