Hand gestures classification of sEMG signals based on BiLSTM-metaheuristic optimization and hybrid U-Net-MobileNetV2 encoder architecture

Abstract

Surface electromyography (sEMG) data have been widely used in deep learning approaches to hand movement classification. This paper introduces a novel method for hand gesture classification from sEMG data that addresses the accuracy limitations of previous studies. We propose a U-Net architecture with a MobileNetV2 encoder, enhanced by a Bidirectional Long Short-Term Memory (BiLSTM) network and metaheuristic optimization, for spatial feature extraction in hand gesture and motion recognition. Bayesian optimization serves as the metaheuristic for tuning the BiLSTM model's architecture. To handle the non-stationarity of sEMG signals, we apply a windowing strategy that augments the signals before they enter the deep learning architecture. The MobileNetV2 encoder within the U-Net then extracts relevant features from sEMG spectrogram images. Edge computing integration further enables real-time processing and decision-making close to the data source. Across six standard databases, the proposed model achieved an average accuracy of 90.23%, a 3-4% improvement in average accuracy together with a 10% reduction in variance. Notably, on Mendeley Data, BioPatRec DB3, and BioPatRec DB1 it surpassed state-of-the-art models in their respective domains with classification accuracies of 88.71%, 90.2%, and 88.6%, respectively. The experimental results demonstrate a significant gain in generalizability and robustness of gesture recognition. This approach offers a fresh perspective on prosthetic control and human-machine interaction, improving accuracy and reducing variance for prosthetic control through edge computing integration.
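The two preprocessing steps the abstract describes, windowing the non-stationary sEMG signal for augmentation and converting each window into a spectrogram image for the U-Net/MobileNetV2 encoder, can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: the sampling rate, window length, step size, and STFT parameters are hypothetical values chosen for the example, and the paper's actual spectrogram pipeline is not specified here.

```python
import numpy as np

def sliding_windows(signal, win_len, step):
    """Segment a multi-channel recording of shape (n_samples, n_channels)
    into overlapping windows of shape (n_windows, win_len, n_channels)."""
    starts = range(0, signal.shape[0] - win_len + 1, step)
    return np.stack([signal[s:s + win_len] for s in starts])

def log_spectrogram(channel, nperseg=64, step=16):
    """Naive log-magnitude STFT of a single sEMG channel, producing an
    image of shape (nperseg // 2 + 1, n_frames) for a CNN encoder."""
    frames = sliding_windows(channel[:, None], nperseg, step)[:, :, 0]
    window = np.hanning(nperseg)            # taper each frame
    spec = np.abs(np.fft.rfft(frames * window, axis=1))
    return np.log1p(spec).T                 # frequency bins as rows

# Hypothetical example: 1 s of 8-channel sEMG at 1 kHz, segmented into
# 200 ms windows with a 50 ms step (values not taken from the paper).
emg = np.random.randn(1000, 8)
windows = sliding_windows(emg, win_len=200, step=50)
print(windows.shape)                        # (17, 200, 8)
img = log_spectrogram(windows[0, :, 0])
print(img.shape)                            # (33, 9)
```

Overlapping windows multiply the number of training samples drawn from each recording, which is why the abstract frames windowing as an augmentation strategy; each resulting spectrogram image is then a fixed-size input for the MobileNetV2 encoder.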
