MMG-Based Motion Segmentation and Recognition of Upper Limb Rehabilitation Using the YOLOv5s-SE



Abstract

Mechanomyography (MMG) is a non-invasive technique for assessing muscle activity by measuring the mechanical signals produced by contracting muscle; it offers high sensitivity and real-time monitoring capability and is widely used in rehabilitation training. Traditional MMG-based motion recognition relies on feature extraction and classifier training, which require continuous actions to be segmented first, limiting both real-time performance and segmentation accuracy. This paper therefore proposes a method for the real-time segmentation and classification of upper limb rehabilitation actions based on the You Only Look Once (YOLO) algorithm, integrating the Squeeze-and-Excitation (SE) attention mechanism to improve model performance. The collected MMG signals were converted into one-dimensional time-series images, which, after preprocessing, were split into training and test sets for training and evaluating the YOLOv5s-SE model. The results show that the proposed model effectively segments both isolated and continuous MMG motions while simultaneously predicting the motion category in real time. In segmentation tasks, the baseline YOLOv5s model achieved 97.9% precision and 98.0% recall, while the improved YOLOv5s-SE model raised precision to 98.7% (+0.8%) and recall to 98.3% (+0.3%). The model also predicted motion categories with a high accuracy of 98.9%. This method segments time-domain motions automatically, avoiding the manual parameter tuning required by traditional approaches, and improves the real-time performance of MMG motion recognition through image-based processing, providing an effective solution for motion analysis in wearable devices.
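The SE attention mechanism integrated into YOLOv5s-SE can be illustrated with a minimal sketch: a "squeeze" step pools each channel of a feature map to a single descriptor, and an "excitation" step passes those descriptors through two small fully connected layers to produce a per-channel attention weight in (0, 1) that rescales the channels. The pure-Python version below is only illustrative, not the paper's implementation; in the actual model the excitation layers are learned, whereas here `w1` and `w2` are placeholder random weights.

```python
import math
import random

def se_block(feature_map, w1, w2):
    """Squeeze-and-Excitation over a feature map (illustrative sketch).

    feature_map: list of C channels, each a list of T values.
    w1: (C/r x C) weights for the squeeze -> hidden layer.
    w2: (C x C/r) weights for the hidden -> attention layer.
    Returns (reweighted feature map, per-channel attention weights).
    """
    C = len(feature_map)
    # Squeeze: global average pooling collapses each channel to one descriptor.
    z = [sum(ch) / len(ch) for ch in feature_map]
    # Excitation, stage 1: fully connected C -> C/r with ReLU.
    hidden = [max(0.0, sum(row[i] * z[i] for i in range(C))) for row in w1]
    # Excitation, stage 2: fully connected C/r -> C with a sigmoid,
    # yielding one attention weight in (0, 1) per channel.
    s = [1.0 / (1.0 + math.exp(-sum(row[j] * hidden[j] for j in range(len(hidden)))))
         for row in w2]
    # Scale: reweight every value of channel k by its attention weight s[k].
    return [[s[k] * v for v in feature_map[k]] for k in range(C)], s

# Toy example: 8 channels, 16 time steps, reduction ratio r = 4.
random.seed(0)
C, T, r = 8, 16, 4
fmap = [[random.gauss(0, 1) for _ in range(T)] for _ in range(C)]
w1 = [[random.gauss(0, 0.5) for _ in range(C)] for _ in range(C // r)]
w2 = [[random.gauss(0, 0.5) for _ in range(C // r)] for _ in range(C)]
out, attn = se_block(fmap, w1, w2)
```

The reduction ratio `r` trades off capacity against cost: the bottleneck keeps the added parameters small, which matters for the real-time, wearable-device setting the paper targets.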
