Automatic Segmentation of Vestibular Schwannoma From MRI Using Two Cascaded Deep Learning Networks


Abstract

OBJECTIVE: Automatic detection and segmentation of vestibular schwannoma (VS) in MRI by deep learning is an emerging research area. Accurate measurement and segmentation of VS are essential for growth monitoring and treatment planning, yet deep learning models face generalization challenges due to tumor variability. We therefore introduce a novel model that cascades two convolutional neural networks (CNNs) to improve the performance of automatic VS segmentation.

METHODS: Deep learning techniques have previously been applied to automatic VS segmentation, including 2D, 2.5D, and 3D variants of the UNet, a CNN architecture designed specifically for medical image segmentation. We introduce a sequential connection in which the first UNet's predicted segmentation map is passed to a second, complementary network for refinement. Spatial attention mechanisms further guide this refinement in the second network.

RESULTS: We conducted experiments on both public and private datasets containing contrast-enhanced T1 and high-resolution T2-weighted magnetic resonance imaging (MRI). On the public dataset, we observed consistent Dice score improvements across all 2D, 2.5D, and 3D CNN variants, with a notable gain of 8.86% for the 2D UNet variant on T1. On our private dataset, a 3.75% improvement was observed for 2D T1. Moreover, T1 images generally outperformed T2 for VS segmentation.

CONCLUSION: Sequentially connecting UNets, combined with spatial attention mechanisms, enhances VS segmentation performance across state-of-the-art 2D, 2.5D, and 3D deep learning methods.

LEVEL OF EVIDENCE: 3. Laryngoscope, 135:1301-1308, 2025.
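The cascade described in the methods can be sketched as follows. This is a minimal, illustrative NumPy mock-up, not the paper's implementation: `coarse_net` and `refine_net` are hypothetical stand-ins for the first and second UNets, and the "spatial attention" is reduced to gating the input by the coarse probability map.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def coarse_net(image):
    # Hypothetical stand-in for the first UNet: per-voxel tumor logits.
    return image - image.mean()

def refine_net(image, attention):
    # Hypothetical stand-in for the second network: re-weights the input
    # by the spatial attention map before predicting refined logits.
    gated = image * attention
    return gated - gated.mean()

def cascaded_segmentation(image, threshold=0.5):
    # Stage 1: coarse probability map from the first network.
    coarse_prob = sigmoid(coarse_net(image))
    # Stage 2: the coarse map acts as a spatial attention signal,
    # steering the second network toward the suspected tumor region.
    refined_prob = sigmoid(refine_net(image, coarse_prob))
    return refined_prob > threshold

rng = np.random.default_rng(0)
mri_slice = rng.normal(size=(64, 64))  # synthetic 2D "T1" slice
mask = cascaded_segmentation(mri_slice)
print(mask.shape, mask.dtype)
```

In the actual architecture, both stages would be trained UNets and the attention map would modulate intermediate feature maps rather than the raw input; the sketch only shows the data flow of passing the first network's prediction into the second.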
