Automatic detection and classification of peri-prosthetic femur fracture


Abstract

PURPOSE: Object classification and localization are key tasks of computer-aided diagnosis (CAD) tools. Although numerous generic deep learning (DL) models have been developed for CAD, no work in the literature evaluates their effectiveness when used to diagnose fractures in proximity to joint implants. In this work, we assess the performance of existing classification systems on binary and multi-class problems (fracture types) using plain radiographs. In addition, we evaluate the performance of object detection systems using one- and two-stage DL architectures.

METHODS: A data set of 1272 X-ray images of peri-prosthetic femur fracture (PFF) was collected. The fractures were annotated with bounding boxes and classified according to the Vancouver Classification System (types A, B, C) by two clinical specialists. Four classification models (Densenet161, Resnet50, Inception, VGG) and two object detection models (Faster RCNN and RetinaNet) were evaluated, and their performance compared. Six confusion-matrix-based measures were reported to evaluate fracture classification; Average Precision and localization accuracy were reported for localization of the fracture.

RESULTS: Resnet50 showed the best performance, with [Formula: see text] accuracy and [Formula: see text] F1-score on the binary fracture/normal classification. In addition, Resnet50 reached [Formula: see text] accuracy on the multi-class problem (normal, Vancouver types A, B and C).

CONCLUSIONS: A large data set of PFF images, with fracture features annotated by two independent assessors, was created to implement a DL-based approach for detecting, classifying and localizing PFFs. The results suggest this approach could be a promising diagnostic tool for fractures in proximity to joint implants.
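To make the evaluation protocol concrete, the sketch below (not the authors' code; the metric choices and example counts are illustrative assumptions) shows how confusion-matrix-based classification measures such as accuracy and F1-score, and the Intersection-over-Union (IoU) criterion commonly used for localization accuracy, can be computed:

```python
# Illustrative sketch, not the paper's implementation: confusion-matrix
# measures for the binary fracture/normal task, plus an IoU function of the
# kind typically used to score bounding-box localization.

def binary_metrics(tp, fp, fn, tn):
    """Common confusion-matrix measures (accuracy, precision, recall,
    specificity, F1) computed from raw counts."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0        # sensitivity
    specificity = tn / (tn + fp) if tn + fp else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1}

def iou(box_a, box_b):
    """Intersection over Union of two (x1, y1, x2, y2) bounding boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Hypothetical counts for a fracture/normal test split.
m = binary_metrics(tp=90, fp=10, fn=5, tn=95)
print(round(m["accuracy"], 3))                 # 0.925
print(round(iou((0, 0, 10, 10), (5, 5, 15, 15)), 3))  # 0.143
```

A predicted box is typically counted as a correct localization when its IoU with the annotated ground-truth box exceeds a fixed threshold (0.5 is a common choice); Average Precision then summarizes precision over recall levels across such matches.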
