An improved multi-scale feature extraction network for medical image segmentation



Abstract

BACKGROUND: U-Net and its variants have driven significant advances in medical image segmentation. However, the encoder-decoder structure of these models often loses spatial information during downsampling. Skip connections help mitigate this loss, but they may also introduce excessive irrelevant background information. In addition, medical images exhibit large scale variations and complex tissue structures, making it difficult for existing models to accurately separate tissues from the background. To address these issues, we developed the Res2Net-ConvFormer-Dilation-UNet (Res2-CD-UNet), a multi-scale feature extraction network for medical image segmentation.

METHODS: This study presents a novel U-shaped segmentation network that employs Res2Net as the backbone and incorporates a convolution-style transformer in the encoding stage to enhance global attention. A novel channel feature fusion block (CFFB) is also introduced in the skip connections to minimize the effect of background noise.

RESULTS: The proposed model was evaluated on two publicly available datasets, Synapse and Seg.A.2023. On the Synapse dataset, the average Dice similarity coefficient (DSC) reached 83.92%, 1.96% higher than that of the second-best model, and the average Hausdorff distance (HD) was 14.51 mm. Of the eight organs evaluated, the model achieved the best results on four. On the Seg.A.2023 dataset, the proposed model likewise achieved the best results, with an average DSC of 93.27%.

CONCLUSIONS: These results indicate that the proposed model segments regions of interest more accurately and extracts multi-scale features from medical images better than existing deep-learning algorithms.
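To illustrate the kind of multi-scale feature extraction a Res2Net backbone provides, the sketch below implements the hierarchical channel-group scheme from the original Res2Net design in plain NumPy. This is a minimal illustration, not the authors' implementation: a fixed 3x3 mean filter stands in for the learned convolutions, and the function names and the `scales=4` setting are assumptions for the example.

```python
import numpy as np

def conv3x3_mean(x):
    """Stand-in for a learned 3x3 convolution: a 3x3 mean filter
    with zero padding. x has shape (C, H, W)."""
    C, H, W = x.shape
    p = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for dy in range(3):
        for dx in range(3):
            out += p[:, dy:dy + H, dx:dx + W]
    return out / 9.0

def res2net_block(x, scales=4):
    """Simplified Res2Net-style block: channels are split into `scales`
    groups; the first group passes through unchanged, and each later
    group receives the previous group's filtered output before its own
    filtering, so successive groups see progressively larger receptive
    fields. A residual connection adds the input back at the end."""
    groups = np.array_split(x, scales, axis=0)
    outs = [groups[0]]          # first split: identity
    prev = None
    for g in groups[1:]:
        inp = g if prev is None else g + prev
        prev = conv3x3_mean(inp)
        outs.append(prev)
    out = np.concatenate(outs, axis=0)
    return out + x              # residual connection

x = np.random.rand(8, 16, 16).astype(np.float32)
y = res2net_block(x)
print(y.shape)  # (8, 16, 16): same shape in and out
```

Because each group is filtered on top of the previous group's output, the fourth group effectively aggregates three stacked 3x3 filters, which is the receptive-field-enlarging effect the Res2Net backbone contributes to the proposed network.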
