Lead-agnostic Self-supervised Learning for Local and Global Representations of Electrocardiogram

Jungwoo Oh, Hyunseung Chung, Joon-myoung Kwon, Dong-gyun Hong, Edward Choi
Proceedings of the Conference on Health, Inference, and Learning, PMLR 174:338-353, 2022.

Abstract

In recent years, self-supervised learning methods have shown significant improvements when pre-training with unlabeled data and have proven helpful for electrocardiogram signals. However, most previous pre-training methods for the electrocardiogram focus on capturing only global contextual representations. This inhibits the models from learning fruitful representations of the electrocardiogram, resulting in poor performance on downstream tasks. Additionally, such models cannot be fine-tuned with an arbitrary set of electrocardiogram leads unless they were pre-trained on the same set of leads. In this work, we propose an ECG pre-training method that learns both local and global contextual representations for better generalizability and performance on downstream tasks. In addition, we propose random lead masking as an ECG-specific augmentation method that makes our model robust to an arbitrary set of leads. Experimental results on two downstream tasks, cardiac arrhythmia classification and patient identification, show that our proposed approach outperforms other state-of-the-art methods.
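The random lead masking augmentation described in the abstract drops entire leads of a multi-lead ECG during pre-training so the model learns not to depend on any fixed lead subset. A minimal illustrative sketch follows; the function name, the per-lead masking probability, and the choice to zero-fill masked leads are assumptions for illustration, not the authors' exact implementation:

```python
import numpy as np

def random_lead_masking(ecg, mask_prob=0.5, rng=None):
    """Randomly mask (zero out) whole leads of a multi-lead ECG.

    ecg: array of shape (n_leads, n_samples), e.g. (12, 5000) for a
         10-second 12-lead recording at 500 Hz.
    mask_prob: probability that each lead is masked (illustrative value).
    Returns the augmented copy and the boolean mask of dropped leads.
    """
    rng = rng if rng is not None else np.random.default_rng()
    augmented = ecg.copy()
    # One Bernoulli draw per lead: True means the whole lead is dropped.
    mask = rng.random(ecg.shape[0]) < mask_prob
    augmented[mask] = 0.0
    return augmented, mask

# Example: augment a synthetic 12-lead ECG.
ecg = np.random.randn(12, 5000)
augmented, mask = random_lead_masking(ecg, mask_prob=0.5)
```

During fine-tuning, a model pre-trained this way can then accept recordings with any available subset of leads, since missing leads look like the masked leads it already saw in pre-training.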

Cite this Paper


BibTeX
@InProceedings{pmlr-v174-oh22a,
  title     = {Lead-agnostic Self-supervised Learning for Local and Global Representations of Electrocardiogram},
  author    = {Oh, Jungwoo and Chung, Hyunseung and Kwon, Joon-myoung and Hong, Dong-gyun and Choi, Edward},
  booktitle = {Proceedings of the Conference on Health, Inference, and Learning},
  pages     = {338--353},
  year      = {2022},
  editor    = {Flores, Gerardo and Chen, George H and Pollard, Tom and Ho, Joyce C and Naumann, Tristan},
  volume    = {174},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--08 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v174/oh22a/oh22a.pdf},
  url       = {https://proceedings.mlr.press/v174/oh22a.html},
  abstract  = {In recent years, self-supervised learning methods have shown significant improvement for pre-training with unlabeled data and have proven helpful for electrocardiogram signals. However, most previous pre-training methods for electrocardiogram focused on capturing only global contextual representations. This inhibits the models from learning fruitful representation of electrocardiogram, which results in poor performance on downstream tasks. Additionally, they cannot fine-tune the model with an arbitrary set of electrocardiogram leads unless the models were pre-trained on the same set of leads. In this work, we propose an ECG pre-training method that learns both local and global contextual representations for better generalizability and performance on downstream tasks. In addition, we propose random lead masking as an ECG-specific augmentation method to make our proposed model robust to an arbitrary set of leads. Experimental results on two downstream tasks, cardiac arrhythmia classification and patient identification, show that our proposed approach outperforms other state-of-the-art methods.}
}
Endnote
%0 Conference Paper
%T Lead-agnostic Self-supervised Learning for Local and Global Representations of Electrocardiogram
%A Jungwoo Oh
%A Hyunseung Chung
%A Joon-myoung Kwon
%A Dong-gyun Hong
%A Edward Choi
%B Proceedings of the Conference on Health, Inference, and Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Gerardo Flores
%E George H Chen
%E Tom Pollard
%E Joyce C Ho
%E Tristan Naumann
%F pmlr-v174-oh22a
%I PMLR
%P 338--353
%U https://proceedings.mlr.press/v174/oh22a.html
%V 174
%X In recent years, self-supervised learning methods have shown significant improvement for pre-training with unlabeled data and have proven helpful for electrocardiogram signals. However, most previous pre-training methods for electrocardiogram focused on capturing only global contextual representations. This inhibits the models from learning fruitful representation of electrocardiogram, which results in poor performance on downstream tasks. Additionally, they cannot fine-tune the model with an arbitrary set of electrocardiogram leads unless the models were pre-trained on the same set of leads. In this work, we propose an ECG pre-training method that learns both local and global contextual representations for better generalizability and performance on downstream tasks. In addition, we propose random lead masking as an ECG-specific augmentation method to make our proposed model robust to an arbitrary set of leads. Experimental results on two downstream tasks, cardiac arrhythmia classification and patient identification, show that our proposed approach outperforms other state-of-the-art methods.
APA
Oh, J., Chung, H., Kwon, J., Hong, D., & Choi, E. (2022). Lead-agnostic Self-supervised Learning for Local and Global Representations of Electrocardiogram. Proceedings of the Conference on Health, Inference, and Learning, in Proceedings of Machine Learning Research 174:338-353. Available from https://proceedings.mlr.press/v174/oh22a.html.
