Comparative analysis of transformer, CNN, and YOLO architectures for mandibular condyle segmentation on panoramic radiographs: a deep learning benchmark


Abstract

BACKGROUND: This study aimed to perform the first multi-architecture comparison of pixel-level mandibular condyle segmentation on panoramic radiographs using transformer-based (RT-DETR), CNN-based (EfficientNet, Mask R-CNN, ConvNeXt), and YOLO-based (YOLOv9-Seg, YOLOv11-Seg) deep learning models.

METHODS: A dataset of 1,300 panoramic radiographs (2,600 condyles) was retrospectively curated. Ground-truth masks were annotated by a primary radiologist and reviewed by a senior radiologist; inter-observer agreement was quantified on a blinded 10% subset (Dice: 0.92 ± 0.03). Six state-of-the-art architectures were trained and evaluated on a fixed test set. Performance was assessed using Intersection over Union (IoU), Dice Similarity Coefficient (DSC), precision, recall, and F1-score.

RESULTS: All models achieved high segmentation accuracy, with DSC values ranging from 0.819 to 0.866. The transformer-based RT-DETR model showed the highest numerical DSC (0.866), IoU (0.764), and F1-score (0.866), indicating a balanced overall segmentation profile. Among the one-stage detectors, YOLOv9-Seg provided competitive results (DSC: 0.862) with high recall (0.902), outperforming CNN-based alternatives. YOLOv11-Seg showed high sensitivity but lower precision compared to other architectures.

CONCLUSIONS: Deep learning enables accurate and automated condylar segmentation on panoramic radiographs. While RT-DETR showed favorable anatomical fidelity for quantitative morphometry, YOLOv9-Seg presented a viable real-time alternative. This study establishes a benchmark for selecting segmentation architectures tailored to specific clinical needs in TMJ analysis.

TRIAL REGISTRATION: Not applicable.
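The overlap metrics reported above (DSC and IoU) are standard functions of two binary masks. The following is a minimal sketch of how they are typically computed; the function names and the NumPy-based implementation are illustrative assumptions, not code from the study.

```python
import numpy as np

def dice_coefficient(pred, gt):
    # Dice Similarity Coefficient (DSC): 2|A ∩ B| / (|A| + |B|)
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    intersection = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    # Convention: two empty masks count as perfect agreement
    return 2.0 * intersection / denom if denom else 1.0

def iou(pred, gt):
    # Intersection over Union (Jaccard index): |A ∩ B| / |A ∪ B|
    pred = np.asarray(pred, dtype=bool)
    gt = np.asarray(gt, dtype=bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return intersection / union if union else 1.0

# Toy 2x2 masks: prediction covers two pixels, ground truth one of them
pred = np.array([[1, 1], [0, 0]])
gt = np.array([[1, 0], [0, 0]])
print(dice_coefficient(pred, gt))  # 2*1 / (2+1) ≈ 0.667
print(iou(pred, gt))               # 1 / 2 = 0.5
```

For binary masks the two metrics are monotonically related (DSC = 2·IoU / (1 + IoU)), which is why model rankings by DSC and IoU in the results above largely coincide.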
