Machine Learning Methods to Track Dynamic Facial Function in Facial Palsy


Abstract

OBJECTIVE: For patients with facial palsy, the wait for the return of facial function can be devastating: poor eye closure puts vision at risk, flaccid oral sphincter muscles make speaking and eating difficult, and the inability to smile or express emotions causes psychological morbidity. Methods for assessing ongoing facial nerve regeneration are limited: clinicians rely on subjective descriptions, imprecise scales, and static photographs to evaluate facial functional recovery. We propose a more precise evaluation of dynamic facial function through video-based machine learning analysis, to facilitate a better understanding of the sometimes subtle onset of facial nerve recovery and to improve guidance for facial reanimation surgery.

METHODS: We present machine learning methods employing likelihood ratio tests, optimal transport theory, and Mahalanobis distances to: 1) assess the use of defined facial landmarks for binary classification of different facial palsy types; 2) identify regions of asymmetry and potential palsy during specific facial cues; and 3) quantify palsy severity and map it directly to widely used clinical scores, offering clinicians an objective way to assess facial nerve function.

RESULTS: Our results demonstrate that video analysis provides a significantly more accurate and detailed assessment of facial movements than previously reported.

CONCLUSIONS: Our work allows for precise classification of facial palsy types, identification of asymmetric regions, and assessment of palsy severity.

SIGNIFICANCE: This project gives clinicians more accurate and timely information for facial reanimation surgery decisions, which can profoundly affect the quality of life of affected patients.
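To illustrate one of the techniques named above, the following is a minimal sketch of how a Mahalanobis distance could quantify left/right landmark asymmetry in a single video frame. It is not the authors' implementation: the landmark layout, the simulated "healthy" reference distribution, and all variable names are illustrative assumptions.

```python
import numpy as np

def mahalanobis_asymmetry(left_pts, right_pts, mean_diff, cov):
    """Distance of the observed left-right landmark difference from a
    reference distribution of differences measured on healthy faces."""
    diff = (left_pts - right_pts).ravel() - mean_diff
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Toy reference distribution built from simulated "healthy" frames:
# 3 paired landmarks, each with (x, y) displacement differences.
rng = np.random.default_rng(0)
healthy_diffs = rng.normal(0.0, 1.0, size=(200, 6))
mean_diff = healthy_diffs.mean(axis=0)
cov = np.cov(healthy_diffs, rowvar=False)

# A roughly symmetric frame: left-right differences drawn from the
# same distribution as the reference.
left = rng.normal(0.0, 1.0, size=(3, 2))
right = left - rng.normal(0.0, 1.0, size=(3, 2))
d_sym = mahalanobis_asymmetry(left, right, mean_diff, cov)

# Simulated unilateral droop: shift the right-side landmarks downward.
right_droop = right + np.array([0.0, 8.0])
d_palsy = mahalanobis_asymmetry(left, right_droop, mean_diff, cov)

print(d_sym, d_palsy)  # the drooped frame should score much higher
```

In practice the reference statistics would be estimated from landmark trajectories of healthy subjects performing the same facial cue, and per-region distances (eye, mouth, brow) could then be thresholded or mapped to a clinical score.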
