Enhanced residual attention-based subject-specific network (ErAS-Net): facial expression-based pain classification with multiple attention mechanisms


Abstract

The automatic detection of pain through the analysis of facial expressions is one of the most critical challenges in the healthcare system. A major difficulty is the variability in how individuals express pain and other emotions through their facial deformations. This research addresses this issue by presenting ErAS-Net, an Enhanced Residual Attention-Based Subject-Specific Network that employs various attention mechanisms. Through transfer learning and multiple attention mechanisms, the proposed deep learning model is designed to mimic human perception of facial expressions, thereby enhancing its pain recognition ability and capturing the unique features of each individual's facial expressions based on their specific patterns. The UNBC-McMaster Shoulder Pain dataset is used to demonstrate the effectiveness of the proposed deep learning algorithm, which achieves 98.77% accuracy for binary classification and 94.21% for four-level pain intensity classification using tenfold cross-validation. Additionally, the model attained 89.83% accuracy for binary classification with the Leave-One-Subject-Out (LOSO) validation method. To further evaluate generalizability, a cross-dataset experiment was conducted using the BioVid Heat Pain Database, where ErAS-Net achieved 78.14% accuracy for binary pain detection on unseen data without fine-tuning. These findings, together with the attention mechanisms inspired by human perception, indicate that the proposed model is a powerful and reliable tool for automatic pain detection.
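The Leave-One-Subject-Out (LOSO) protocol mentioned above is key for subject-specific models: every subject's frames are held out in turn, so the model is always tested on an unseen individual. A minimal sketch of how such splits can be generated (the subject IDs and frame labels here are hypothetical, not from the UNBC-McMaster dataset):

```python
def loso_splits(samples):
    """Yield (held_out_subject, train_indices, test_indices) tuples.

    `samples` is a list of (subject_id, frame) pairs; in LOSO validation
    all frames of one subject form the test fold while the remaining
    subjects form the training fold.
    """
    subjects = sorted({sid for sid, _ in samples})
    for held_out in subjects:
        train = [i for i, (sid, _) in enumerate(samples) if sid != held_out]
        test = [i for i, (sid, _) in enumerate(samples) if sid == held_out]
        yield held_out, train, test


# Usage: three hypothetical subjects with two frames each.
data = [
    ("s1", "frame_a"), ("s1", "frame_b"),
    ("s2", "frame_c"), ("s2", "frame_d"),
    ("s3", "frame_e"), ("s3", "frame_f"),
]
splits = list(loso_splits(data))
# One split per subject; no subject's frames appear in both folds.
```

Because every subject appears exactly once as the test fold, the averaged accuracy (89.83% in this work) reflects generalization to unseen individuals rather than memorized subject-specific appearance.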
