Introducing a Deep Neural Network Model with Practical Implementation for Polyp Detection in Colonoscopy Videos

Abstract

BACKGROUND: Deep learning has gained considerable attention in computer-assisted minimally invasive surgery in recent years. Applications of deep-learning algorithms in colonoscopy fall into four main categories: surgical image analysis, surgical operations analysis, evaluation of surgical skills, and surgical automation. Deep-learning analysis of surgical images can be one of the main routes to early detection of gastrointestinal lesions and to taking appropriate action to treat cancer.

METHOD: This study investigates a simple and accurate deep-learning model for polyp detection. We address the challenge of limited labeled data through transfer learning, and we employ multi-task learning to perform both polyp classification and bounding-box detection. Choosing the appropriate weight for each task in the total cost function is crucial for achieving the best results. Because existing datasets lack non-polyp images, additional data collection was carried out. The proposed deep neural network was trained and evaluated on polyp images from the KVASIR-SEG and CVC-CLINIC datasets, together with non-polyp images extracted from the LDPolyp videos dataset.

RESULTS: The proposed model demonstrated high accuracy, achieving 100% in polyp/non-polyp classification and 86% in bounding-box detection. It also showed fast processing times (0.01 seconds), making it suitable for real-time clinical applications.

CONCLUSION: The developed deep-learning model offers an efficient, accurate, and cost-effective solution for real-time polyp detection in colonoscopy. Its performance on benchmark datasets supports its potential for clinical deployment, aiding early cancer diagnosis and treatment.
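The abstract does not disclose the implementation, but the core idea it describes, a shared network with a classification head and a bounding-box head whose losses are combined with task weights, can be sketched as follows. This is a minimal illustration and not the authors' code: the backbone (an ImageNet-pretrained ResNet-18 standing in for the unspecified transfer-learning base), the individual loss functions (cross-entropy and smooth L1), and the default task weights are all assumptions.

```python
# Hypothetical sketch of the weighted multi-task objective described in the
# abstract. The backbone, head sizes, loss choices, and weights are
# assumptions for illustration, not details taken from the paper.
import torch
import torch.nn as nn
import torchvision.models as models


class PolypMultiTaskNet(nn.Module):
    def __init__(self, cls_weight: float = 1.0, box_weight: float = 1.0):
        super().__init__()
        # Transfer learning: reuse ImageNet-pretrained features.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        feat_dim = backbone.fc.in_features
        # Two task heads: polyp/non-polyp classification and
        # 4-value bounding-box regression.
        self.cls_head = nn.Linear(feat_dim, 2)
        self.box_head = nn.Linear(feat_dim, 4)  # (x, y, w, h), normalized
        self.cls_weight = cls_weight
        self.box_weight = box_weight
        self.cls_loss = nn.CrossEntropyLoss()
        self.box_loss = nn.SmoothL1Loss()

    def forward(self, images):
        feats = self.features(images).flatten(1)
        return self.cls_head(feats), self.box_head(feats)

    def total_loss(self, images, labels, boxes):
        logits, pred_boxes = self.forward(images)
        loss_cls = self.cls_loss(logits, labels)
        # Only polyp frames (label == 1) contribute to the box loss.
        mask = labels == 1
        if mask.any():
            loss_box = self.box_loss(pred_boxes[mask], boxes[mask])
        else:
            loss_box = torch.zeros((), device=images.device)
        # Weighted sum: balancing these weights is the tuning step
        # the abstract describes as crucial.
        return self.cls_weight * loss_cls + self.box_weight * loss_box
```

Masking the box loss so that non-polyp frames contribute only to the classification term is one common way to train such a two-headed detector on a mix of polyp and non-polyp images; tuning cls_weight and box_weight then corresponds to the per-task weighting of the total cost function mentioned in the abstract.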
