Comparative analysis of nnU-Net and Auto3Dseg for fat and fibroglandular tissue segmentation in MRI


Abstract

PURPOSE: Breast cancer, the most common cancer type among women worldwide, requires early detection and accurate diagnosis for improved treatment outcomes. Segmenting fat and fibroglandular tissue (FGT) in magnetic resonance imaging (MRI) is essential for creating volumetric models, enhancing surgical workflow, and improving clinical outcomes. Manual segmentation is time-consuming and subjective, prompting the development of automated deep-learning algorithms to perform this task. However, configuring these algorithms for 3D medical images is challenging due to variations in image features and preprocessing distortions. Automated machine learning (AutoML) frameworks automate model selection, hyperparameter tuning, and architecture optimization, offering a promising solution by reducing reliance on manual intervention and expert knowledge. APPROACH: We compare nnU-Net and Auto3Dseg, two AutoML frameworks, in segmenting fat and FGT on T1-weighted MR images from the Duke breast MRI dataset (100 patients). We used threefold cross-validation, employing the Dice similarity coefficient (DSC) and Hausdorff distance (HD) metrics for evaluation. The F-test and Tukey's honestly significant difference (HSD) analysis were used to assess statistical differences across methods. RESULTS: nnU-Net achieved DSC scores of 0.946 ± 0.026 (fat) and 0.872 ± 0.070 (FGT), whereas Auto3Dseg achieved 0.940 ± 0.026 (fat) and 0.871 ± 0.074 (FGT). Significant differences in fat HD (F = 6.3020, p < 0.001) originated from the full-resolution and 3D cascade U-Net configurations. No evidence of significant differences was found in FGT HD or DSC metrics. CONCLUSIONS: Ensemble approaches of Auto3Dseg and nnU-Net demonstrated comparable performance in segmenting fat and FGT on breast MRI. The significant differences in fat HD underscore the importance of boundary-focused metrics in evaluating segmentation methods.
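For illustration only, the two evaluation metrics named in the abstract can be sketched in NumPy. This is not the study's implementation (the function names and the brute-force pairwise-distance Hausdorff computation are our own simplifications); it shows the standard definitions: DSC = 2|A ∩ B| / (|A| + |B|), and HD as the larger of the two directed Hausdorff distances between the foreground point sets.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, gt).sum() / denom

def hausdorff_distance(pred: np.ndarray, gt: np.ndarray) -> float:
    """Symmetric Hausdorff distance between the foreground voxel sets.

    Brute-force O(|P|*|G|) pairwise Euclidean distances; fine for small
    masks, not for full-resolution MRI volumes.
    """
    p = np.argwhere(pred)  # foreground coordinates, shape (|P|, ndim)
    g = np.argwhere(gt)
    d = np.sqrt(((p[:, None, :] - g[None, :, :]) ** 2).sum(-1))
    # max over each set of the distance to its nearest point in the other set
    return float(max(d.min(axis=1).max(), d.min(axis=0).max()))
```

In practice, libraries such as MONAI or SciPy provide optimized versions of both metrics; the sketch above is only meant to make the definitions concrete.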
