Comparison of five convolutional neural networks for predicting osteoporosis based on mandibular cortical index on panoramic radiographs


Abstract

OBJECTIVES: The aim of the present study was to compare five convolutional neural networks for predicting osteoporosis based on the mandibular cortical index (MCI) on panoramic radiographs.

METHODS: Panoramic radiographs of 744 female patients over 50 years of age were labeled C1, C2, or C3 according to the MCI. The data were organized into four classification tasks: one three-class task (C1, C2, C3) and three two-class tasks (C1, C2), (C1, C3), and (C1, (C2 + C3)). Twenty percent of the data were randomly set aside as test data, and the remaining data were used for training and validation with fivefold cross-validation. AlexNET, GoogleNET, ResNET-50, SqueezeNET, and ShuffleNET deep-learning models were trained via transfer learning. The results were evaluated with performance criteria including accuracy, sensitivity, specificity, F1-score, AUC, and training duration. Gradient-Weighted Class Activation Mapping (Grad-CAM) was applied to visualize which image regions the deep-learning algorithms extracted their features from.

RESULTS: On the (C1, C2, C3) dataset, AlexNET achieved an accuracy rate of 81.14%; on (C1, C2), GoogleNET achieved 88.94%; on (C1, C3), AlexNET achieved 98.56%; and on (C1, (C2 + C3)), GoogleNET achieved 92.79%.

CONCLUSION: The highest accuracy was obtained in the differentiation of C3 from C1, where osseous structure characteristics change significantly. Since the C2 score represents the intermediate stage (osteopenia), the structural characteristics of the bone behave closer to those of both the C1 and C3 scores. Therefore, the datasets that included the C2 score yielded relatively lower accuracy.
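To make the evaluation criteria concrete, the following minimal Python sketch computes accuracy, sensitivity, specificity, and F1-score from binary confusion-matrix counts, as would apply to one of the study's two-class tasks (e.g. C1 vs. C3). The counts in the usage example are hypothetical illustrations, not figures from the study.

```python
def binary_metrics(tp, fp, fn, tn):
    """Compute accuracy, sensitivity, specificity, and F1-score
    from binary confusion-matrix counts (true/false positives,
    true/false negatives)."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    sensitivity = tp / (tp + fn)        # recall for the positive class
    specificity = tn / (tn + fp)        # recall for the negative class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "f1": f1}

# Hypothetical counts for a 200-image test split (illustration only):
print(binary_metrics(tp=90, fp=5, fn=10, tn=95))
```

Reporting sensitivity and specificity alongside accuracy matters here because the class balance differs between tasks; a model can reach high accuracy on an imbalanced split while missing most positive cases.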
