An ultrasound image segmentation method for thyroid nodules based on a dual-path attention mechanism-enhanced UNet++


Abstract

PURPOSE: This study aims to design an auxiliary segmentation model for thyroid nodules that increases diagnostic accuracy and efficiency, thereby reducing the workload of medical personnel.

METHODS: This study proposes a Dual-Path Attention Mechanism (DPAM)-UNet++ model that automatically segments thyroid nodules in ultrasound images. Specifically, the model incorporates dual-path attention modules into the skip connections of the UNet++ network to capture global contextual information in the feature maps. A new integrated loss function was also designed for the DPAM-UNet++ network. Performance was evaluated with Intersection over Union (IoU), F1 score, accuracy, precision, recall, Area Under the Curve (AUC), and the 95th-percentile Hausdorff distance (HD95).

RESULTS: In comparative experiments with classical segmentation models, DPAM-UNet++ achieved an IoU of 0.7451, an F1 score of 0.8310, an accuracy of 0.9718, a precision of 0.8443, a recall of 0.8702, an AUC of 0.9213, and an HD95 of 35.31. The model outperformed the comparison models on every metric except precision, and its segmentations were closer to the ground-truth labels. Ablation experiments verified the effectiveness and necessity of both the dual-path attention mechanism and the integrated loss function.

CONCLUSION: The proposed segmentation model effectively captures global contextual information in ultrasound images and accurately locates nodule regions, yielding strong segmentation results, especially for small and multiple nodules. The integrated loss function further improves the segmentation of nodule edges, enhancing the model's accuracy on edge details.
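The abstract does not give implementation details, so the following is a minimal PyTorch sketch of one plausible reading: it assumes the dual-path attention module combines a channel-attention path and a spatial-attention path (in the style of CBAM) applied to skip-connection features, and that the integrated loss combines binary cross-entropy with a soft Dice term, a common composition for segmentation. The names DualPathAttention and integrated_loss are illustrative, not the authors' code.

```python
# Hypothetical sketch of a dual-path (channel + spatial) attention module
# for skip connections, plus a BCE + Dice "integrated" loss.
# Assumed design; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualPathAttention(nn.Module):
    """Channel-attention and spatial-attention paths applied in sequence
    to a skip-connection feature map (assumption: CBAM-style design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel path: squeeze spatial dims, produce per-channel weights.
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial path: 7x7 conv over pooled channel statistics.
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention from average- and max-pooled descriptors.
        avg = self.channel_mlp(F.adaptive_avg_pool2d(x, 1).view(b, c))
        mx = self.channel_mlp(F.adaptive_max_pool2d(x, 1).view(b, c))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention from per-pixel mean/max channel statistics.
        stats = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(stats))

def integrated_loss(logits: torch.Tensor, target: torch.Tensor,
                    bce_weight: float = 0.5, eps: float = 1e-6) -> torch.Tensor:
    """One common 'integrated' composition: weighted BCE + soft Dice.
    The paper's exact loss terms are not stated in the abstract."""
    bce = F.binary_cross_entropy_with_logits(logits, target)
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum(dim=(1, 2, 3))
    dice = (2 * inter + eps) / (prob.sum(dim=(1, 2, 3))
                                + target.sum(dim=(1, 2, 3)) + eps)
    return bce_weight * bce + (1 - bce_weight) * (1 - dice).mean()
```

In a UNet++-style decoder, a module like this would sit on each skip connection, filtering the encoder feature map before it is concatenated with the upsampled decoder features; a Dice term in the loss is one standard way to sharpen boundary segmentation, consistent with the abstract's claim of improved edge detail.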
