Deep learning models for segmentation and quantification of left atrial appendage volume using noncontrast cardiac computed tomography


Abstract

BACKGROUND: The left atrial appendage (LAA) is a critical but frequently overlooked site of thrombus formation, reinforcing the need for its accurate identification in routine cardiac imaging. Thrombus formation in the LAA is related to pathological dilation associated with endothelial injury and a proinflammatory state. This study assesses the performance of four U-Net-based deep learning architectures, UNet3D, Residual-UNet3D, 3D Attention-UNet, and Res16-PAC-UNet, in semiautomated segmentation and volume measurement of the LAA.

METHODS: We retrospectively analyzed noncontrast cardiac computed tomography (NCCT) scans from 452 patients aged ≥ 60 years, acquired for chest pain evaluation, and compared the four architectures on semiautomated LAA segmentation and volume measurement. Segmentation accuracy was assessed with the Dice coefficient, and volumetric agreement with Pearson correlation and Bland-Altman analysis.

RESULTS: Dice coefficients were 78.44 ± 1.93 for UNet3D, 78.97 ± 0.79 for Residual-UNet3D, 79.07 ± 1.43 for 3D Attention-UNet, and 77.68 ± 1.47 for Res16-PAC-UNet. All models showed strong correlations between predicted and manual volumes (P < 0.001), with the highest for 3D Attention-UNet (r = 0.800). Bland-Altman analysis indicated minimal bias and narrow limits of agreement for all architectures, confirming consistent reliability.

CONCLUSIONS: Deep learning-based segmentation on NCCT enables accurate, reproducible morphological and volumetric assessment of the LAA without contrast, offering a rapid and reliable tool to support cardiovascular risk stratification and treatment planning.
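The two evaluation metrics named in the abstract have standard definitions. A minimal sketch of both, using NumPy on toy binary masks and hypothetical volume values (not data from the study): the Dice coefficient is 2|A∩B| / (|A| + |B|) for binary masks A and B, and Bland-Altman agreement is summarized by the mean difference (bias) and the limits of agreement at bias ± 1.96 SD.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # convention: two empty masks agree perfectly
    return 2.0 * np.logical_and(pred, truth).sum() / denom

def bland_altman(pred_vols: np.ndarray, manual_vols: np.ndarray):
    """Return (bias, lower limit, upper limit) of agreement
    between predicted and manual volume measurements."""
    diffs = pred_vols - manual_vols
    bias = diffs.mean()
    sd = diffs.std(ddof=1)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy 3D masks (4x4x4 voxels), illustrative only
pred = np.zeros((4, 4, 4), dtype=np.uint8)
truth = np.zeros((4, 4, 4), dtype=np.uint8)
pred[1:3, 1:3, 1:3] = 1   # 8 voxels predicted
truth[1:4, 1:3, 1:3] = 1  # 12 voxels in reference, 8 overlapping
print(dice_coefficient(pred, truth))  # 2*8 / (8 + 12) = 0.8
```

In the study's terms, `dice_coefficient` would be evaluated per scan against the manual segmentation (values around 78-79 if reported in percent), while `bland_altman` summarizes the bias and limits of agreement between network-derived and manually measured LAA volumes.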
