Joint spatio-temporal features constrained self-supervised electrocardiogram representation learning


Abstract

Electrocardiogram (ECG) measurements with clinical diagnostic labels are intrinsically limited, and existing self-supervised methods either fail to provide satisfactory ECG representations or require considerable effort to curate large expert-annotated datasets. We propose a generative, spatio-temporal joint detection based self-supervised method that learns general ECG representations applicable to various downstream tasks with little or no human supervision, thus reducing the dependence on labeled data. Considering the spatio-temporal characteristics of ECG signals, we dynamically and randomly mask the original signal (temporal detection) and shuffle the order of the leads (spatial detection); the model learns by reconstructing the original signal and predicting the lead numbers. To validate the effectiveness of the proposed method, we pre-train our model on several publicly available ECG databases as well as a private ventricular tachycardia ECG dataset, and evaluate on two downstream tasks: diagnostic classification of 27 arrhythmia types and localization of ventricular tachycardia origin sites. The results show that the learned ECG representations are effective and demonstrate the feasibility of self-supervised representation learning from ECG data. Our self-supervised method needs only 60% of the labeled data used by the supervised method to achieve the same performance, and with the same amount of data it improves classification and localization accuracy by 1.3% and 8.6%, respectively, over a randomly initialized model on the two downstream tasks.
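The two pretext tasks described above can be sketched as data transforms. The following is a minimal NumPy sketch, not the authors' implementation: the function name, the zero-fill masking, and the `mask_ratio` value are illustrative assumptions; the paper's actual masking strategy and hyperparameters may differ.

```python
import numpy as np

def spatio_temporal_pretext(signal, mask_ratio=0.3, seed=0):
    """Build the two self-supervised pretext inputs for an ECG array.

    signal: (n_leads, n_samples) array. Returns the temporally masked
    signal (reconstruction target is the original), the mask itself,
    the lead-shuffled signal, and the permutation (lead-number labels).
    """
    rng = np.random.default_rng(seed)
    n_leads, n_samples = signal.shape

    # Temporal detection: randomly zero-mask a fraction of time points;
    # the network is trained to reconstruct the original signal.
    mask = rng.random(n_samples) < mask_ratio
    masked = signal.copy()
    masked[:, mask] = 0.0

    # Spatial detection: shuffle the lead order; the network is trained
    # to predict each lead's original index from the shuffled input.
    perm = rng.permutation(n_leads)
    shuffled = signal[perm]

    return masked, mask, shuffled, perm
```

In a pre-training loop, `masked` would feed a reconstruction head supervised by the original `signal`, while `shuffled` would feed a classification head supervised by `perm`.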
