DI-UNet: dual-branch interactive U-Net for skin cancer image segmentation


Abstract

PURPOSE: Skin disease is a prevalent type of physical ailment that can manifest in a multitude of forms. Many internal diseases are directly reflected on the skin, and if left unattended, skin diseases can potentially develop into skin cancer. Accurate and effective segmentation of skin lesions, especially melanoma, is critical for the early detection and diagnosis of skin cancer. However, the complex color variations, boundary ambiguity, and scale variations in skin lesion regions present significant challenges for precise segmentation. METHODS: We propose a novel approach for melanoma segmentation using a dual-branch interactive U-Net architecture. Two distinct sampling strategies are integrated into the network simultaneously, creating a vertical dual-branch structure. Meanwhile, we introduce a novel dual-channel symmetrical convolution block (DCS-Conv), which employs a symmetrical design that gives the network a horizontal dual-branch structure. The combination of vertically and horizontally distributed dual-branch structures increases both the depth and width of the network, providing greater diversity and richer multiscale cascade features. Additionally, this paper introduces a novel module called the residual fuse-and-select module (RFS module), which leverages self-attention mechanisms to focus on specific skin cancer features and suppress irrelevant artifacts, further improving segmentation accuracy. RESULTS: We evaluated our approach on two public skin cancer datasets, ISIC2016 and PH2, and achieved state-of-the-art results, surpassing previous methods in segmentation accuracy and overall performance. CONCLUSION: Our proposed approach holds tremendous potential to aid dermatologists in clinical decision-making.
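The abstract does not specify which two sampling strategies form the vertical dual-branch structure, so the following is only a conceptual sketch of the general idea: downsampling the same feature map with two complementary strategies (here, hypothetically, max pooling for strong activations and average pooling for smooth context) and stacking the results so later layers see both views. It is illustrative NumPy code, not the paper's implementation.

```python
import numpy as np

def pool2d(x, mode="max"):
    """Downsample a 2D feature map by 2x using max or average pooling."""
    h, w = x.shape
    # Group the map into non-overlapping 2x2 blocks, dropping odd edges.
    blocks = x[: h - h % 2, : w - w % 2].reshape(h // 2, 2, w // 2, 2)
    if mode == "max":
        return blocks.max(axis=(1, 3))
    return blocks.mean(axis=(1, 3))

# Toy 4x4 "feature map": one branch keeps the strongest activations
# (max pooling), the other keeps smoothed context (average pooling);
# stacking both gives a decoder two complementary views of each region.
feat = np.arange(16, dtype=float).reshape(4, 4)
branch_a = pool2d(feat, "max")          # shape (2, 2)
branch_b = pool2d(feat, "avg")          # shape (2, 2)
fused = np.stack([branch_a, branch_b])  # shape (2, 2, 2), channel-stacked
```

In a full network, each branch would feed its own encoder path and the stacked (or concatenated) features would flow into shared decoder stages, which is one plausible reading of "interactive" dual branches.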
