An Unsupervised Learning-Based Regional Deformable Model for Automated Multi-Organ Contour Propagation


Abstract

The aim of this study was to evaluate a regional deformable model, based on a deep unsupervised learning model, for automatic contour propagation in cone-beam computed tomography (CBCT)-guided adaptive radiation therapy of the breast. A deep unsupervised learning model was introduced to map the breast tumor bed, clinical target volume, heart, left lung, right lung, and spinal cord from the planning CT to the CBCT. To improve on traditional image registration methods, we used a regional deformable framework based on narrow-band mapping, which mitigates the effect of image artifacts on the CBCT. We retrospectively selected 373 anonymized CBCT volumes from 111 patients with breast cancer and divided them into three sets: 311 for training, 20 for validation, and 42 for testing. Manual contours served as the reference for the testing set, and performance was evaluated by comparing the model's predictions against this reference. The mean Dice coefficients between the manual reference segmentations and the predicted segmentations for the breast tumor bed, clinical target volume, heart, left lung, right lung, and spinal cord were 0.78 ± 0.09, 0.90 ± 0.03, 0.88 ± 0.04, 0.94 ± 0.03, 0.95 ± 0.02, and 0.77 ± 0.07, respectively. These results demonstrate good agreement between the reference and the propagated contours. The proposed deep learning-based regional deformable model can automatically propagate contours for breast cancer adaptive radiotherapy. Deep learning for contour propagation is promising, but further investigation is warranted.
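The evaluation metric reported above is the Dice similarity coefficient between manual and propagated contours. The abstract does not give implementation details, so the following is only a minimal NumPy sketch of how the Dice coefficient is computed between two binary segmentation masks; the function name and the toy masks are illustrative, not taken from the study.

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks.

    Dice = 2 * |A ∩ B| / (|A| + |B|); returns 1.0 when both masks
    are empty (both contours absent counts as perfect agreement here).
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two overlapping 6x6 squares on a 10x10 grid.
reference = np.zeros((10, 10), dtype=bool)
reference[2:8, 2:8] = True          # |A| = 36
predicted = np.zeros((10, 10), dtype=bool)
predicted[3:9, 3:9] = True          # |B| = 36, overlap = 25
score = dice_coefficient(reference, predicted)  # 2*25 / 72 ≈ 0.694
```

In practice the same computation would be applied per organ on the 3-D CBCT masks, yielding the per-structure means reported in the abstract.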
