MFS-Unet: A Multi-Path Vision Mamba Network for Precise Thyroid Nodule Segmentation


Abstract

The automated segmentation of thyroid nodules from ultrasound images holds significant value for clinical diagnosis and treatment. However, precise segmentation remains challenging due to blurred nodule boundaries, variable nodule scales, image noise, and inaccurate annotations. To address these difficulties, this paper proposes a novel medical image segmentation network named MFS-Unet, which introduces three modules to enhance segmentation performance. First, we design a multi-path vision Mamba (MPV) module, which leverages the advantages of state space models (SSMs) to efficiently capture global contextual information and multi-scale features with linear computational complexity, effectively handling the large variation in nodule size. Second, a feature gating (FG) module is deployed in the skip connections between the encoder and decoder; through an attention mechanism, it dynamically screens and enhances features transmitted from the encoder, suppressing background noise and reinforcing key boundary information of the nodules. Finally, we propose a supervised label rectification (SLR) module that proactively handles the prevalent label noise in training data: by dynamically adjusting loss weights during training, it guides the model toward more robust feature representations. We conducted extensive experiments on three public thyroid ultrasound datasets: DDTI, TG3K, and TN3K. The results show that MFS-Unet outperforms various state-of-the-art segmentation methods across all evaluation metrics, demonstrating its effectiveness and significant potential for precise thyroid nodule segmentation in complex ultrasound environments.
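The abstract does not give the SLR module's exact formulation, but the general idea of down-weighting likely-mislabeled training signal via dynamic loss weights can be illustrated with a minimal sketch. All names and the percentile-threshold rule below are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

def rectified_loss(per_pixel_loss: np.ndarray, percentile: float = 90,
                   noise_weight: float = 0.1) -> float:
    """Hypothetical sketch of label-noise-aware loss reweighting:
    pixels whose loss exceeds a high percentile are treated as
    possibly mislabeled and contribute with a reduced weight."""
    threshold = np.percentile(per_pixel_loss, percentile)
    # Suspected noisy pixels get noise_weight; clean pixels keep weight 1.0.
    weights = np.where(per_pixel_loss > threshold, noise_weight, 1.0)
    return float(np.mean(weights * per_pixel_loss))

# Usage: one outlier pixel with a very large loss (a plausible sign of
# an annotation error) is down-weighted, lowering the aggregate loss.
losses = np.array([0.1, 0.2, 0.15, 5.0])
print(rectified_loss(losses))  # smaller than the unweighted mean
```

In practice such weights would be recomputed each training step (or epoch), so the model is steered away from fitting inconsistent annotations while still learning from the bulk of the data.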
