Semi-Supervised Segmentation Framework for Gastrointestinal Lesion Diagnosis in Endoscopic Images


Abstract

BACKGROUND: Accurate gastrointestinal (GI) lesion segmentation is crucial for diagnosing digestive tract diseases. Automatic lesion segmentation in endoscopic images is vital for relieving physicians' workload and improving patients' survival rates. However, although the strong results of deep learning approaches on many tasks depend heavily on large labeled datasets, pixel-wise annotation is highly labor-intensive, especially in clinical settings, whereas large unlabeled image datasets are often readily available. Limited labeled data also hinders the generalizability of models trained under fully supervised learning for computer-aided diagnosis (CAD) systems. METHODS: This work proposes a semi-supervised segmentation framework based on generative adversarial learning for GI lesion diagnosis in endoscopic images, tackling the challenge of limited annotations. The proposed approach leverages a limited annotated dataset together with a large unlabeled dataset when training the networks. We extensively tested the proposed method on 4880 endoscopic images. RESULTS: Compared with current related works, the proposed method achieves better results (Dice similarity coefficient = 89.42 ± 3.92, intersection over union = 80.04 ± 5.75, precision = 91.72 ± 4.05, recall = 90.11 ± 5.64, and Hausdorff distance = 23.28 ± 14.36) on challenging multi-site datasets, confirming the effectiveness of the proposed framework. CONCLUSION: We explore a semi-supervised lesion segmentation method that makes full use of abundant unlabeled endoscopic images to improve lesion segmentation accuracy. Experimental results confirmed the potential of our method, which outperformed current related works. The proposed CAD system can help minimize diagnostic errors.
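The overlap metrics reported above (Dice similarity coefficient and intersection over union) can be computed directly from binary segmentation masks. The following is a minimal sketch, not code from the paper; the function names and the toy 4×4 masks are illustrative assumptions.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def iou(pred, target, eps=1e-7):
    """Intersection over union (Jaccard index): |A∩B| / |A∪B| for binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

# Toy example: the prediction covers 2 of the 3 ground-truth lesion pixels.
pred = np.zeros((4, 4), dtype=np.uint8)
target = np.zeros((4, 4), dtype=np.uint8)
target[1, 1:4] = 1   # ground-truth lesion: 3 pixels
pred[1, 1:3] = 1     # predicted lesion: 2 pixels, both inside the ground truth

print(round(dice_coefficient(pred, target), 3))  # 2*2/(2+3) = 0.8
print(round(iou(pred, target), 3))               # 2/3 ≈ 0.667
```

In practice these metrics are averaged over all test images, which is how the reported 89.42 ± 3.92 Dice and 80.04 ± 5.75 IoU figures are typically obtained; the small `eps` term guards against division by zero when both masks are empty.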
