An efficient surface electromyography-based gesture recognition algorithm based on multiscale fusion convolution and channel attention.

Authors: Jiang Bin, Wu Hao, Xia Qingling, Xiao Hanguang, Peng Bo, Wang Li, Zhao Yun
In the field of rehabilitation, although deep learning has been widely applied to multitype gesture recognition via surface electromyography (sEMG), its high algorithmic complexity often leads to computational inefficiency, which compromises practicality. To achieve more efficient multitype recognition, we propose the Residual-Inception-Efficient (RIE) model, which integrates the Inception module and efficient channel attention (ECA). Inception, a multiscale fusion convolutional module, is adopted to enhance the extraction of sEMG features; it uses fast dimensionality reduction, asymmetric convolution decomposition, and pooling to suppress the accumulation of parameters, thereby reducing algorithmic complexity. ECA is adopted to reweight the output features of Inception across channels, enabling the RIE model to focus on information that is more relevant to gestures. Experiments on 52-, 49-, and 52-class gesture recognition are conducted on the NinaPro DB1, DB3, and DB4 datasets, which contain 14,040, 3,234, and 3,120 gesture samples, respectively. The proposed RIE model achieves accuracies of 88.27%, 69.52%, and 84.55% on the three datasets, exhibiting excellent recognition accuracy and strong generalization ability. Moreover, the method reduces algorithmic complexity in both space and time, making it smaller and faster than other lightweight algorithms. The proposed RIE model thus combines lightweight computational requirements with reliable performance, providing an efficient deep learning method for sEMG-based gesture recognition.
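The channel-reweighting step described above can be illustrated with a minimal NumPy sketch of ECA: global average pooling produces one descriptor per channel, a 1-D convolution across neighbouring channels (without dimensionality reduction) produces an attention score, and a sigmoid gate rescales each channel's feature map. The uniform convolution weights below are placeholders standing in for learned parameters; the function name and shapes are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def eca_reweight(features, kernel_size=3):
    """Sketch of Efficient Channel Attention (ECA) reweighting.

    features: array of shape (channels, height, width), e.g. sEMG feature
    maps output by a convolutional module such as Inception.
    """
    c, h, w = features.shape
    # 1. Global average pooling: one descriptor per channel.
    desc = features.mean(axis=(1, 2))                      # shape (c,)
    # 2. 1-D convolution across neighbouring channels; unlike SE attention,
    #    ECA uses no dimensionality reduction. Uniform weights are a
    #    placeholder for the learned kernel.
    pad = kernel_size // 2
    padded = np.pad(desc, pad, mode="edge")
    weights = np.ones(kernel_size) / kernel_size
    conv = np.array([np.dot(padded[i:i + kernel_size], weights)
                     for i in range(c)])                   # shape (c,)
    # 3. Sigmoid gate, then rescale each channel's feature map.
    gate = sigmoid(conv)                                   # values in (0, 1)
    return features * gate[:, None, None]

x = np.random.rand(8, 4, 4)   # 8 channels of 4x4 feature maps
y = eca_reweight(x)
print(y.shape)                # (8, 4, 4)
```

Because the gate is a per-channel scalar in (0, 1), the output keeps the spatial layout of the input while channels judged less informative are attenuated, which is what lets the model emphasize gesture-relevant channels at negligible parameter cost.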
