Automatic multi-needle localization in ultrasound images using large margin mask RCNN for ultrasound-guided prostate brachytherapy


Abstract

Multi-needle localization in ultrasound (US) images is a crucial step of treatment planning for US-guided prostate brachytherapy. However, current computer-aided techniques mostly focus on single-needle digitization, while manual digitization is labor-intensive and time-consuming. In this paper, we propose a deep learning-based workflow for fast automatic multi-needle digitization, covering both needle shaft detection and needle tip detection. The workflow comprises two components: a large margin mask R-CNN model (LMMask R-CNN), which adopts the large margin loss to reformulate Mask R-CNN for needle shaft localization, and a needle-based density-based spatial clustering of applications with noise (DBSCAN) algorithm, which integrates priors to model each needle iteratively for needle shaft refinement and tip detection. In addition, we use skip connections in the network architecture to improve supervision in the hidden layers. The workflow was evaluated on 23 patients who underwent US-guided high-dose-rate (HDR) prostate brachytherapy, with 339 needles tested in total. Our method detected 98% of the needles, with a shaft error of 0.091 ± 0.043 mm and a tip error of 0.330 ± 0.363 mm. Compared with using Mask R-CNN alone or LMMask R-CNN alone, the proposed method achieves a significant improvement in both shaft error and tip error. The proposed method automatically digitizes all needles of a patient within a second. It streamlines the workflow of transrectal ultrasound-guided HDR prostate brachytherapy and paves the way for the development of a real-time treatment planning system that is expected to further elevate the quality and outcome of HDR prostate brachytherapy.
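For readers unfamiliar with the clustering step, the sketch below shows the generic DBSCAN algorithm grouping needle-candidate pixel coordinates into per-needle clusters while rejecting isolated detections as noise. This is a minimal pure-Python illustration of plain DBSCAN only; the paper's needle-based variant, its priors, and its iterative shaft/tip refinement are not reproduced here, and the toy point sets (`shaft_a`, `shaft_b`) are invented for demonstration.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 = noise).

    Illustrative sketch only -- the paper's needle-based DBSCAN adds
    needle priors and iterative refinement not shown here.
    """
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        # All points (including i itself) within eps of point i.
        return [j for j, q in enumerate(points)
                if math.dist(points[i], q) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisional noise; may become a border point
            continue
        cluster += 1                # i is a core point: start a new cluster
        labels[i] = cluster
        seeds = [j for j in nbrs if j != i]
        while seeds:                # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # claim border point, but do not expand it
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors(j)) >= min_pts:
                seeds.extend(neighbors(j))
    return labels

# Hypothetical candidate detections: two vertical "shafts" plus one stray point.
shaft_a = [(10.0, float(y)) for y in range(10)]
shaft_b = [(30.0, float(y)) for y in range(10)]
pts = shaft_a + shaft_b + [(20.0, 50.0)]
labels = dbscan(pts, eps=1.5, min_pts=3)
# Each shaft forms one cluster; the stray point is labeled noise (-1).
```

In the paper's setting, each resulting cluster corresponds to one needle shaft, and a line model fitted to the cluster then supports shaft refinement and tip localization.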
