Automated radiographic assessment of lower limb alignment using deep learning in a data-constrained clinical setting


Abstract

BACKGROUND: A model built from stacked hourglass neural network modules was developed for automated lower limb alignment measurement in full-leg frontal radiographs. The goal was to assess whether high accuracy could be achieved when training on relatively small datasets, a common constraint in clinical research. METHODS: The model first identifies joint regions of interest (ROIs), then detects the anatomical landmarks needed for alignment measurement within each ROI. 112 bilateral radiographs (216 annotated single-leg images) were used for training and validation, 99 (186 legs) formed the internal test set, and 25 (49 legs) the external test set. Performance metrics included intraclass correlation coefficients (ICCs) and mean absolute error (MAE). RESULTS: The model achieved excellent agreement (ICCs > 0.9) with manual annotations for all alignment parameters except the joint line convergence angle (ICC < 0.5), a parameter for which agreement between human readers was also comparatively low (ICC = 0.73). CONCLUSION: The proposed approach demonstrates high accuracy and reliability for automated lower limb alignment assessment, offering a robust solution for clinical research with limited annotated datasets.
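The two agreement metrics the abstract relies on can be sketched in plain Python. This is a minimal illustration, not the paper's code: the function names, the two-rater layout (model vs. human reader per leg), and the choice of the two-way random, single-measure ICC(2,1) form are assumptions for the example.

```python
from statistics import mean

def mean_absolute_error(pred, ref):
    """MAE between automated and manual angle measurements (degrees)."""
    return mean(abs(p - r) for p, r in zip(pred, ref))

def icc_2_1(ratings):
    """Two-way random-effects, single-measure ICC(2,1).

    `ratings` holds one row per subject (here, one leg) and one
    column per rater (e.g. [model, human]). Illustrative only; the
    paper does not specify which ICC variant was used.
    """
    n = len(ratings)      # number of subjects (legs)
    k = len(ratings[0])   # number of raters
    grand = mean(x for row in ratings for x in row)
    row_means = [mean(row) for row in ratings]
    col_means = [mean(col) for col in zip(*ratings)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)  # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)  # between raters
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    ms_cols = ss_cols / (k - 1)
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

With perfectly matching raters ICC(2,1) evaluates to 1.0, and it falls toward 0 as the raters diverge relative to between-subject spread, which is why a low ICC for the joint line convergence angle flags both the model and the human readers as inconsistent on that parameter.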
