Advancements in prostate cancer segmentation: Integrating prostate zonal information


Abstract

BACKGROUND AND OBJECTIVE: Prostate cancer is one of the most common cancers in men, and early diagnosis is critical. Segmentation of cancerous regions in multiparametric MRI is a key step. Deep neural networks such as nnU-Net perform well, and incorporating prostate zonal information may further improve accuracy.

METHODS: This study introduces four prostate cancer segmentation ensembles that integrate zonal data, compared against a baseline model that uses zonal information as a separate input channel. The ensembles employ zone-specific prostate cancer segmentation models trained with the nnU-Net method. To address variability in manual annotations, a new evaluation metric, the tolerant Dice similarity coefficient, DSC(τ), is proposed, which accounts for inaccuracies in the ground truth.

RESULTS: Ensemble 3 yields the best performance, with a 4.77% higher mean DSC and a 6.17% higher mean DSC(τ) than the baseline. Although the metrics of Ensemble 4 are slightly lower, it reduces false positives by 7.79% and uses fewer models (2 vs. 3), making it more efficient. Furthermore, the Conover post hoc test for unreplicated blocked data shows no statistically significant difference in performance between the two ensembles. Thus, Ensemble 4 is the preferred approach for prostate cancer segmentation. Additionally, all ensembles achieve 5.03% to 7.13% higher mean DSC(τ) values than the standard DSC, confirming the effectiveness of the new metric in handling segmentation uncertainties.

CONCLUSION: The experimental results indicate that the proposed Ensemble 4 is the most suitable solution for the prostate cancer segmentation task. Moreover, the results indicate that the proposed metric, DSC(τ), accounts for errors in the ground truth segmentation.
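The abstract does not give the exact definition of DSC(τ), so the sketch below illustrates one plausible construction of a tolerance-aware Dice score: voxels of the prediction lying within τ voxels of the ground truth (and vice versa) count as matches, softening the penalty for small boundary-annotation inaccuracies. The function names and the dilation-based formulation are assumptions for illustration, not the paper's definition.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Standard Dice similarity coefficient for binary masks."""
    inter = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    return 2.0 * inter / total if total > 0 else 1.0

def tolerant_dice(pred: np.ndarray, gt: np.ndarray, tau: int = 1) -> float:
    """Hypothetical tolerant Dice, DSC(tau): a predicted voxel counts as a
    true positive if it lies within tau voxels of the ground truth, and a
    ground-truth voxel counts if it lies within tau voxels of the prediction.
    With tau = 0 this reduces to the standard Dice coefficient."""
    if tau == 0:
        return dice(pred, gt)
    gt_dil = binary_dilation(gt, iterations=tau)      # tolerance band around GT
    pred_dil = binary_dilation(pred, iterations=tau)  # tolerance band around prediction
    tp_pred = np.logical_and(pred, gt_dil).sum()      # pred voxels near GT
    tp_gt = np.logical_and(gt, pred_dil).sum()        # GT voxels near pred
    total = pred.sum() + gt.sum()
    return (tp_pred + tp_gt) / total if total > 0 else 1.0
```

Because dilation only enlarges the matching region, DSC(τ) is never lower than the standard DSC for τ ≥ 0, which is consistent with the 5.03%-7.13% increase the abstract reports.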
