Geometric evaluation of a deep learning method for segmentation of urinary OARs on magnetic resonance imaging for prostate cancer radiotherapy


Abstract

INTRODUCTION: While urinary organs at risk (OARs) such as the intraprostatic urethra and the bladder trigone are increasingly recognized as being associated with severe genitourinary toxicity, their delineation in clinical practice is time consuming and likely subject to large interobserver variability. The aim of this study was to propose a magnetic resonance (MR) deep learning segmentation of urinary OARs for prostate cancer (PCa) radiotherapy (RT), based on a validated atlas.

MATERIAL AND METHODS: In this multicentric study, a convolutional neural network (CNN) for image segmentation (nnU-Net) was trained and validated on three image datasets. Two datasets came from MR-linac devices (Unity®, Elekta and MRIdian®, ViewRay), and one dataset came from the PROSTATEx database (MAGNETOM® Trio and Skyra, Siemens). The deep learning segmentation was evaluated using the Dice similarity coefficient (DSC), surface distance (SD) and Hausdorff distance.

RESULTS: A total of 265 MRIs were analyzed. The mean DSC across all urinary structures was 0.88. The automatic segmentation model proved effective for the target volume and large OARs such as the bladder (mean DSC of 0.95). For the urinary OARs, the mean DSC ranged between 0.50 and 0.68. The Hausdorff distance ranged from 4.0 mm to 10.3 mm for urinary OARs, highlighting local mismatches caused by large anatomical variations between patients. However, the SD ranged between 1.0 mm and 1.3 mm for urinary OARs, indicating an overall good surface correlation for all organs.

CONCLUSION: This multicentric study is the first to propose an nnU-Net deep learning model for the delineation of urinary OARs that can be applied to various image datasets. Further work is needed to assess the dosimetric impact of such variations in various clinical scenarios.
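The geometric metrics reported in the abstract (DSC and Hausdorff distance) have standard definitions that can be sketched as follows. This is an illustrative implementation using NumPy, not the authors' evaluation code; the function names and the toy masks are assumptions for demonstration only.

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: conventionally perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

def hausdorff_distance(a_points: np.ndarray, b_points: np.ndarray) -> float:
    """Symmetric Hausdorff distance between two point sets of shape (N, D), in mm.

    It is the largest distance from any point in one set to its nearest
    neighbour in the other set, which is why a single local mismatch
    (e.g. an anatomical outlier) can drive it up even when surfaces agree
    well on average.
    """
    # Pairwise Euclidean distances between all points of the two contours.
    d = np.linalg.norm(a_points[:, None, :] - b_points[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Toy example: two 4x4 masks overlapping in half their voxels -> DSC = 0.5.
pred = np.zeros((4, 4)); pred[1:3, 1:3] = 1
truth = np.zeros((4, 4)); truth[1:3, 2:4] = 1
print(dice_score(pred, truth))  # 0.5
```

In practice, segmentation toolkits report the Hausdorff distance on surface voxels (often as the 95th-percentile variant, HD95) precisely because the maximum is so sensitive to single outliers, while the mean surface distance smooths them out; this is consistent with the abstract's observation of large Hausdorff values alongside small SD values.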
