Robust colonoscopy polyp segmentation using dynamic-Nu T-Loss with multi-scale and uncertainty-aware adaptation


Abstract

Accurate segmentation of polyps in colonoscopy images is essential for early colorectal cancer detection; however, it remains a challenging task due to reflections, occlusions, motion artifacts, inter- and intra-polyp appearance variability, and the presence of noisy or inconsistent ground-truth annotations. In this work, we introduce dynamic-Nu T-Loss (DNA-TLoss), a robust, adaptive loss function based on the heavy-tailed Student's t-distribution that incorporates three novel extensions: (1) a per-image learnable degrees-of-freedom parameter ν, predicted by a lightweight NuPredictor network to dynamically adjust robustness to outliers; (2) per-pixel precision weights λ for spatially adaptive error sensitivity; and (3) a multi-scale aggregation scheme that computes and combines the loss at multiple spatial resolutions to capture both coarse and fine details. Integrated into a U-Net with a ResNet-34 encoder, DNA-TLoss was evaluated on five public benchmarks: CVC-300, CVC-ClinicDB, ETIS-LaribPolypDB, Kvasir, and CVC-ColonDB. Our method achieves the lowest Hausdorff distance across all datasets, with an average reduction of 14.6% compared to T-Loss; notably, on CVC-300, it yields a significant decrease of 45.96%. It also obtains the lowest false discovery rate on all five datasets, improving over T-Loss by up to 38.7% on CVC-300 and 24.5% on Kvasir. Furthermore, DNA-TLoss delivers best-in-class calibration, achieving an expected calibration error as low as 0.44% on CVC-300 and outperforming all other baselines on four out of five datasets. These results highlight the promise of joint global and local uncertainty adaptation, coupled with multi-scale optimization, for advancing trustworthy, real-time computer-aided polyp detection in colonoscopy.
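To make the three extensions concrete, the following is a minimal NumPy sketch of the core idea: a Student's-t negative log-likelihood with a per-image ν and per-pixel precisions λ, averaged over several spatial resolutions. The abstract does not specify the exact loss formula, pooling operator, or scale weights, so the NLL form (up to constants), average pooling, and equal scale weights here are all assumptions, not the paper's implementation; in the actual method, ν comes from the NuPredictor network and λ is learned per pixel.

```python
import numpy as np

def student_t_loss(pred, target, nu, lam, eps=1e-6):
    """Per-pixel Student's-t negative log-likelihood, up to constant terms.

    pred, target : (H, W) arrays; nu : scalar degrees of freedom;
    lam : (H, W) per-pixel precision weights. Large residuals are
    down-weighted via log1p, giving robustness to annotation outliers.
    """
    r2 = (pred - target) ** 2
    nll = 0.5 * (nu + 1.0) * np.log1p(lam * r2 / nu) - 0.5 * np.log(lam + eps)
    return nll.mean()

def downsample(x, s):
    """Average-pool x by an integer factor s (H and W divisible by s)."""
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def multiscale_t_loss(pred, target, nu, lam, scales=(1, 2, 4)):
    """Combine the t-loss across resolutions (equal weights assumed)."""
    losses = [
        student_t_loss(downsample(pred, s), downsample(target, s),
                       nu, downsample(lam, s))
        for s in scales
    ]
    return float(np.mean(losses))
```

As ν grows, log1p(λr²/ν) approaches λr²/ν and the loss behaves like a weighted squared error; small ν gives heavier tails and more tolerance of outlier pixels, which is why adapting ν per image is useful when annotation quality varies.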
