Evaluating the accuracy of automated cephalometric analysis based on artificial intelligence


Abstract

BACKGROUND: The purpose of this study was to evaluate the accuracy of automatic cephalometric landmark localization and measurement via artificial intelligence (AI) compared with computer-assisted manual analysis.

METHODS: Reconstructed lateral cephalograms (RLCs) from cone-beam computed tomography (CBCT) in 85 patients were selected. Computer-assisted manual analysis (Dolphin Imaging 11.9) and automatic AI analysis (Planmeca Romexis 6.2) were used to locate 19 landmarks and obtain 23 measurements. Mean radial error (MRE) and successful detection rate (SDR) values were calculated to assess the accuracy of automatic landmark digitization. Paired t tests and Bland–Altman plots were used to compare the differences and agreement in cephalometric measurements between the manual and automatic analysis programs.

RESULTS: The MRE for the 19 cephalometric landmarks was 2.07 ± 1.35 mm with the automatic program. The average SDRs within 1 mm, 2 mm, 2.5 mm, 3 mm and 4 mm were 18.82%, 58.58%, 71.70%, 82.04% and 91.39%, respectively. Soft tissue landmarks (1.54 ± 0.85 mm) were the most consistent, whereas dental landmarks (2.37 ± 1.55 mm) showed the most variation. In total, 15 of the 23 measurements fell within the clinically acceptable level of accuracy (2 mm or 2°). The rate of agreement within the 95% limits of agreement exceeded 90% for all measurement parameters.

CONCLUSION: Automatic analysis software yields cephalometric measurements that are nearly accurate enough to be acceptable in clinical work. Nevertheless, automatic cephalometry cannot yet completely replace manual tracing; additional manual supervision and adjustment of automatic programs can increase accuracy and efficiency.
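As a rough illustration of how the two landmark-accuracy metrics reported above are typically computed, the sketch below derives MRE (mean Euclidean distance between paired landmark positions) and SDR (fraction of landmarks within each distance threshold) from (x, y) coordinates in millimetres. The function name, threshold set, and coordinate values are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def mre_and_sdr(manual, auto, thresholds=(1.0, 2.0, 2.5, 3.0, 4.0)):
    """Compute mean radial error and successful detection rates.

    manual, auto: (n_landmarks, 2) arrays of (x, y) positions in mm,
    one row per landmark, matched by index. Hypothetical helper.
    """
    # Radial error = Euclidean distance between paired landmarks
    radial_errors = np.linalg.norm(manual - auto, axis=1)
    mre = float(radial_errors.mean())
    # SDR at threshold t = proportion of landmarks with error <= t
    sdr = {t: float((radial_errors <= t).mean()) for t in thresholds}
    return mre, sdr

# Illustrative data only (three landmarks, not from the study)
manual = np.array([[10.0, 20.0], [35.0, 42.0], [50.0, 18.0]])
auto   = np.array([[10.5, 20.5], [37.0, 42.0], [50.0, 21.5]])
mre, sdr = mre_and_sdr(manual, auto)
```

For the paired-measurement comparison, the corresponding per-parameter differences (automatic minus manual) would then feed a paired t test and a Bland–Altman analysis, with the 95% limits of agreement taken as the mean difference ± 1.96 standard deviations of the differences.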
