Regularizing Brain Age Prediction via Gated Knowledge Distillation

Yanwu Yang, Guo Xutao, Chenfei Ye, Yang Xiang, Ting Ma
Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, PMLR 172:1430-1443, 2022.

Abstract

Brain age has been shown to be a phenotype relevant to cognitive performance and brain disease. With the development of deep learning, the accuracy of brain age estimation has greatly improved. However, such methods are prone to over-fitting and generalize poorly, especially when brain imaging data are insufficient. This paper presents a novel regularization method that penalizes the predictive distribution using knowledge distillation and introduces additional knowledge to reinforce the learning process. During knowledge distillation, we propose a gated distillation mechanism that enables the student model to attentively learn key knowledge from the teacher model, under the assumption that the teacher may not always be correct. Moreover, to enhance knowledge transfer, hint-representation similarity is also adopted to regularize model training. We evaluate the model on a cohort of 3,655 subjects from four public datasets, demonstrating that the proposed method improves prediction performance over several well-established models, achieving a mean absolute error of 2.129 years.
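The abstract describes the method only at a high level. As a rough illustration, the following PyTorch sketch shows one plausible reading of the two ideas: a gate that distills only on samples where the teacher outperforms the student, and a FitNets-style hint-similarity regularizer. The function name, the gating rule, and the loss weights (alpha, beta) are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn.functional as F

def gated_distillation_loss(student_pred, teacher_pred, target,
                            student_hint, teacher_hint,
                            alpha=0.5, beta=0.1):
    # Hypothetical sketch, not the paper's exact formulation.
    # Predictions and target: shape (N,); hints: shape (N, D) features.

    # Supervised regression term on the ground-truth age.
    task_loss = F.l1_loss(student_pred, target)

    # Gate: distill only the samples where the teacher is closer to the
    # ground truth than the student ("the teacher may not always be correct").
    with torch.no_grad():
        gate = (teacher_pred - target).abs() < (student_pred - target).abs()
    if gate.any():
        kd_loss = F.mse_loss(student_pred[gate], teacher_pred[gate].detach())
    else:
        kd_loss = student_pred.new_zeros(())

    # Hint regularizer (FitNets-style): align intermediate representations
    # of student and teacher via cosine similarity.
    hint_loss = 1 - F.cosine_similarity(
        student_hint, teacher_hint.detach(), dim=1).mean()

    return task_loss + alpha * kd_loss + beta * hint_loss

Gating on the teacher's per-sample error relative to the student's is just one way to encode the "teacher may be wrong" assumption; the paper should be consulted for the exact mechanism.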

Cite this Paper


BibTeX
@InProceedings{pmlr-v172-yang22a,
  title     = {Regularizing Brain Age Prediction via Gated Knowledge Distillation},
  author    = {Yang, Yanwu and Xutao, Guo and Ye, Chenfei and Xiang, Yang and Ma, Ting},
  booktitle = {Proceedings of The 5th International Conference on Medical Imaging with Deep Learning},
  pages     = {1430--1443},
  year      = {2022},
  editor    = {Konukoglu, Ender and Menze, Bjoern and Venkataraman, Archana and Baumgartner, Christian and Dou, Qi and Albarqouni, Shadi},
  volume    = {172},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--08 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v172/yang22a/yang22a.pdf},
  url       = {https://proceedings.mlr.press/v172/yang22a.html}
}
Endnote
%0 Conference Paper
%T Regularizing Brain Age Prediction via Gated Knowledge Distillation
%A Yanwu Yang
%A Guo Xutao
%A Chenfei Ye
%A Yang Xiang
%A Ting Ma
%B Proceedings of The 5th International Conference on Medical Imaging with Deep Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Ender Konukoglu
%E Bjoern Menze
%E Archana Venkataraman
%E Christian Baumgartner
%E Qi Dou
%E Shadi Albarqouni
%F pmlr-v172-yang22a
%I PMLR
%P 1430--1443
%U https://proceedings.mlr.press/v172/yang22a.html
%V 172
APA
Yang, Y., Xutao, G., Ye, C., Xiang, Y. & Ma, T. (2022). Regularizing Brain Age Prediction via Gated Knowledge Distillation. Proceedings of The 5th International Conference on Medical Imaging with Deep Learning, in Proceedings of Machine Learning Research 172:1430-1443. Available from https://proceedings.mlr.press/v172/yang22a.html.
