On-board Training Strategy for IMU-Based Real-Time Locomotion Recognition of Transtibial Amputees With Robotic Prostheses


Abstract

The paper puts forward an on-board model-training strategy and presents a real-time human locomotion mode recognition study based on the trained model, using two inertial measurement units (IMUs) mounted on a robotic transtibial prosthesis. Three transtibial amputees were recruited as subjects to perform five locomotion modes (level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent) with robotic prostheses. An interaction interface was designed to collect the sensors' data and to trigger model training and recognition. In this study, the variance ratio (no more than 0.05) reflects good gait repeatability. The on-board training times for SVM (Support Vector Machine), QDA (Quadratic Discriminant Analysis), and LDA (Linear Discriminant Analysis) are 89, 25, and 10 s, respectively, on a 10,000 × 80 training data set. Each recognition process costs about 13.4, 5.36, and 0.067 ms for SVM, QDA, and LDA, respectively. Taking the recognition accuracies of previous studies and the time consumption into consideration, we choose QDA for the real-time recognition study. The real-time recognition accuracy is 97.19 ± 0.36% with QDA, and more than 95% recognition accuracy is achieved for each locomotion mode. The receiver operating characteristic curves also show the good quality of the QDA classifiers. This study provides a preliminary interaction design for human-machine prosthetics toward future clinical application. It adopts only two IMUs, rather than multi-type sensor fusion, to improve integration and wearing convenience, while maintaining recognition accuracy comparable to multi-type sensor fusion.
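The abstract's choice of QDA rests on fitting a per-class Gaussian (mean, covariance, prior) and classifying by the quadratic discriminant score. The minimal NumPy sketch below illustrates that mechanism only; the `SimpleQDA` class, the synthetic 10,000 × 80 feature matrix, and the five-class labels are all hypothetical stand-ins, not the paper's implementation or its IMU data set.

```python
import numpy as np


class SimpleQDA:
    """Minimal Quadratic Discriminant Analysis sketch (illustrative only)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_, self.inv_covs_, self.log_dets_, self.log_priors_ = [], [], [], []
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # Small ridge term keeps the per-class covariance invertible.
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.means_.append(mu)
            self.inv_covs_.append(np.linalg.inv(cov))
            self.log_dets_.append(np.linalg.slogdet(cov)[1])
            self.log_priors_.append(np.log(len(Xc) / len(X)))
        return self

    def predict(self, X):
        scores = []
        for mu, inv_cov, log_det, log_prior in zip(
            self.means_, self.inv_covs_, self.log_dets_, self.log_priors_
        ):
            d = X - mu
            # Mahalanobis distance of each row to the class mean.
            maha = np.einsum("ij,jk,ik->i", d, inv_cov, d)
            scores.append(-0.5 * log_det - 0.5 * maha + log_prior)
        return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]


# Synthetic stand-in for a 10,000 x 80 feature matrix with 5 locomotion classes.
rng = np.random.default_rng(0)
X_parts, y_parts = [], []
for c in range(5):
    Xc = rng.normal(size=(2000, 80))
    Xc[:, c] += 6.0  # shift one dimension so classes are separable
    X_parts.append(Xc)
    y_parts.append(np.full(2000, c))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

model = SimpleQDA().fit(X, y)
train_accuracy = (model.predict(X) == y).mean()
```

Because prediction is just a handful of matrix operations per class, this structure makes plausible the sub-millisecond-to-few-millisecond per-recognition latency the abstract reports for the discriminant classifiers.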
