Initial application of deep learning to borescope detection of endoscope working channel damage and residue



Abstract

Background and study aims: Outbreaks of endoscopy-related infections have prompted evaluation of potential contributing factors. We and others have demonstrated the utility of borescope inspection of endoscope working channels to identify occult damage that may impact the adequacy of endoscope reprocessing. The time investment and training necessary for borescope inspection have been cited as barriers preventing implementation. We investigated the utility of artificial intelligence (AI) for streamlining and enhancing the value of borescope inspection of endoscope working channels.

Methods: We applied a deep learning AI approach to borescope inspection videos of the working channels of 20 endoscopes in use at our academic institution. We evaluated the sensitivity, accuracy, and reliability of this software for detection of endoscope working channel findings.

Results: Overall sensitivity for AI-based detection of borescope inspection findings identified by gold-standard endoscopist inspection was 91.4%. Labels were accurate for 67% of these working channel findings, and accuracy varied by endoscope segment. Read-to-read variability was minimal, with a test-retest correlation of 0.986. Endoscope type did not predict accuracy of the AI system (P = 0.26).

Conclusions: Harnessing the power of AI for detection of endoscope working channel damage and residue could enable sterile processing department technicians to feasibly assess endoscopes for working channel damage and perform endoscope reprocessing surveillance. Endoscopes that accumulate an unacceptable level of damage may be flagged for further manual evaluation and consideration for manufacturer evaluation/repair.
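The abstract does not describe how the evaluation metrics were computed. The following is a minimal Python sketch, under assumed conventions, of how per-finding sensitivity, label accuracy, and test-retest reliability of the kind reported above could be calculated; all variable names and the toy data are illustrative assumptions, not values from the study.

```python
# Illustrative sketch (not from the paper) of the reported evaluation metrics,
# assuming each gold-standard finding is marked as detected or missed by the AI
# and each detection carries a predicted label. Toy data only.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical gold-standard findings: 1 = AI detected the finding, 0 = missed.
detected = np.array([1, 1, 1, 0, 1, 1, 1, 1, 1, 0, 1, 1])

# For each detected finding: 1 = AI label matched the endoscopist's label.
label_correct = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1])

# Sensitivity: fraction of gold-standard findings the AI detected.
sensitivity = detected.sum() / detected.size

# Label accuracy: fraction of detected findings with the correct label.
label_accuracy = label_correct.sum() / label_correct.size

# Test-retest reliability: correlation of per-endoscope finding counts
# between two independent reads of the same videos (toy counts shown).
read1_counts = np.array([4, 7, 2, 9, 5, 3])
read2_counts = np.array([4, 6, 2, 9, 5, 3])
test_retest_r, _ = pearsonr(read1_counts, read2_counts)

print(f"Sensitivity: {sensitivity:.3f}")
print(f"Label accuracy: {label_accuracy:.3f}")
print(f"Test-retest r: {test_retest_r:.3f}")
```

In this framing, sensitivity is computed against endoscopist inspection as the reference standard, and reliability is expressed as a correlation between repeated automated reads of the same videos.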
