Self-Supervised Transformers for fMRI representation

Itzik Malkiel, Gony Rosenman, Lior Wolf, Talma Hendler
Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, PMLR 172:895-913, 2022.

Abstract

We present TFF, a Transformer framework for the analysis of functional Magnetic Resonance Imaging (fMRI) data. TFF employs a two-phase training approach. First, self-supervised training is applied to a collection of fMRI scans, where the model is trained to reconstruct 3D volume data. Second, the pre-trained model is fine-tuned on specific tasks, utilizing ground truth labels. Our results show state-of-the-art performance on a variety of fMRI tasks, including age and gender prediction, as well as schizophrenia recognition. Our code for the training, network architecture, and results is attached as supplementary material.
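The two-phase recipe above (reconstruction pre-training, then supervised fine-tuning) can be illustrated with a minimal NumPy sketch. This is not the authors' TFF code: the Transformer is replaced by a linear encoder (for which the reconstruction-optimal solution is the top principal subspace, computed in closed form), and all data, shapes, and names are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, H = 512, 32, 8                    # scans, voxel dim, latent dim (illustrative)
S = rng.normal(size=(N, H))             # hidden sources generating the data
A = rng.normal(size=(H, D))             # mixing into flattened "volumes"
X = S @ A + 0.1 * rng.normal(size=(N, D))
y = (S[:, 0] > 0).astype(float)         # stand-in binary label (e.g. gender)

# Phase 1: self-supervised reconstruction. With a linear encoder/decoder the
# reconstruction objective is minimized by the top-H PCA subspace, so we take
# it in closed form; TFF instead trains a Transformer on this objective.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
W_enc = Vt[:H].T                        # D x H "pre-trained" encoder
Z = Xc @ W_enc                          # learned representation

# Phase 2: fine-tune a logistic classification head on the representation,
# using the ground-truth labels (plain gradient descent on cross-entropy).
w, b = np.zeros(H), 0.0
for _ in range(2000):
    logits = np.clip(Z @ w + b, -30.0, 30.0)
    p = 1.0 / (1.0 + np.exp(-logits))   # predicted probability of class 1
    w -= 0.05 * (Z.T @ (p - y)) / N
    b -= 0.05 * (p - y).mean()

acc = (((Z @ w + b) > 0) == y.astype(bool)).mean()
print(f"downstream accuracy: {acc:.2f}")
```

Because the labels depend only on the latent sources, the reconstruction-trained encoder already captures the relevant structure, and the small fine-tuned head recovers it; this is the intuition behind pre-training on unlabeled scans before task-specific fine-tuning.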

Cite this Paper


BibTeX
@InProceedings{pmlr-v172-malkiel22a,
  title     = {Self-Supervised Transformers for fMRI representation},
  author    = {Malkiel, Itzik and Rosenman, Gony and Wolf, Lior and Hendler, Talma},
  booktitle = {Proceedings of The 5th International Conference on Medical Imaging with Deep Learning},
  pages     = {895--913},
  year      = {2022},
  editor    = {Konukoglu, Ender and Menze, Bjoern and Venkataraman, Archana and Baumgartner, Christian and Dou, Qi and Albarqouni, Shadi},
  volume    = {172},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v172/malkiel22a/malkiel22a.pdf},
  url       = {https://proceedings.mlr.press/v172/malkiel22a.html},
  abstract  = {We present TFF, which is a Transformer framework for the analysis of functional Magnetic Resonance Imaging (fMRI) data. TFF employs a two-phase training approach. First, self-supervised training is applied to a collection of fMRI scans, where the model is trained to reconstruct 3D volume data. Second, the pre-trained model is fine-tuned on specific tasks, utilizing ground truth labels. Our results show state-of-the-art performance on a variety of fMRI tasks, including age and gender prediction, as well as schizophrenia recognition. Our code for the training, network architecture, and results is attached as supplementary material.}
}
Endnote
%0 Conference Paper
%T Self-Supervised Transformers for fMRI representation
%A Itzik Malkiel
%A Gony Rosenman
%A Lior Wolf
%A Talma Hendler
%B Proceedings of The 5th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Ender Konukoglu
%E Bjoern Menze
%E Archana Venkataraman
%E Christian Baumgartner
%E Qi Dou
%E Shadi Albarqouni
%F pmlr-v172-malkiel22a
%I PMLR
%P 895--913
%U https://proceedings.mlr.press/v172/malkiel22a.html
%V 172
%X We present TFF, which is a Transformer framework for the analysis of functional Magnetic Resonance Imaging (fMRI) data. TFF employs a two-phase training approach. First, self-supervised training is applied to a collection of fMRI scans, where the model is trained to reconstruct 3D volume data. Second, the pre-trained model is fine-tuned on specific tasks, utilizing ground truth labels. Our results show state-of-the-art performance on a variety of fMRI tasks, including age and gender prediction, as well as schizophrenia recognition. Our code for the training, network architecture, and results is attached as supplementary material.
APA
Malkiel, I., Rosenman, G., Wolf, L., & Hendler, T. (2022). Self-Supervised Transformers for fMRI representation. Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 172:895-913. Available from https://proceedings.mlr.press/v172/malkiel22a.html.