A foundational transformer leveraging full-night, multichannel sleep study data accurately classifies sleep stages



Abstract

STUDY OBJECTIVES: To evaluate whether a foundational transformer using 8-hour, multichannel polysomnogram (PSG) data can effectively encode signals and classify sleep stages with state-of-the-art performance.

METHODS: The Sleep Heart Health Study, Wisconsin Sleep Cohort, and Osteoporotic Fractures in Men (MrOS) Study visit 1 were used for training, and the Multi-Ethnic Study of Atherosclerosis (MESA), Apnea Positive Pressure Long-term Efficacy Study (APPLES), and MrOS visit 2 served as independent test sets. We developed PFTSleep, a self-supervised foundational transformer that encodes full-night sleep studies with brain, movement, cardiac, oxygen, and respiratory channels. These representations were then used to train a second model to classify sleep stages. We compared our results to existing methods, examined how performance varied with channel input data and training dataset size, and applied an AI explainability tool to analyze the model's decision processes.

RESULTS: PFTSleep was trained on 13,888 sleep studies and tested on 4,169 independent studies. Cohen's kappa scores were 0.81 for our held-out set, 0.59 for APPLES, 0.60 for MESA, and 0.75 for MrOS visit 2. Performance increased to 0.76 on a held-out MESA set when MESA was included in the training of the classifier head but not the transformer. Compared to other state-of-the-art AI models, our model shows high performance across diverse datasets while using only task-agnostic PSG representations from a foundational transformer as input for sleep stage classification.

CONCLUSIONS: Full-night, multichannel PSG representations from a foundational transformer enable accurate sleep stage classification comparable to state-of-the-art AI methods across diverse datasets.
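The abstract reports agreement as Cohen's kappa, which corrects raw epoch-by-epoch agreement between the model and the human scorer for agreement expected by chance. As a minimal illustrative sketch (the stage labels and data here are toy values, not from the study), kappa can be computed as (p_o − p_e) / (1 − p_e), where p_o is observed agreement and p_e is chance agreement from the marginal label frequencies:

```python
# Minimal Cohen's kappa for sleep-stage agreement (toy data, not study data).
from collections import Counter

def cohens_kappa(y_true, y_pred):
    """Chance-corrected agreement between scorer labels and model predictions."""
    n = len(y_true)
    # Observed agreement: fraction of epochs where the two label sequences match.
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n
    # Chance agreement: product of marginal label frequencies, summed over stages.
    true_counts, pred_counts = Counter(y_true), Counter(y_pred)
    p_e = sum(true_counts[c] * pred_counts.get(c, 0) for c in true_counts) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 30-second epochs scored with the five AASM stages (W, N1, N2, N3, REM).
scorer = ["W", "N2", "N2", "N3", "REM", "N2", "W", "N1"]
model  = ["W", "N2", "N1", "N3", "REM", "N2", "W", "N2"]
print(round(cohens_kappa(scorer, model), 3))  # → 0.667
```

Here 6 of 8 epochs agree (p_o = 0.75) but chance agreement from the stage frequencies is p_e = 0.25, giving kappa = 0.667; the study's reported kappas of 0.59 to 0.81 are interpreted on this same chance-corrected scale.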
