A Deep Learning Model with a Self-Attention Mechanism for Leg Joint Angle Estimation across Varied Locomotion Modes


Abstract

Conventional trajectory planning for lower limb assistive devices usually relies on a finite-state strategy, which pre-defines fixed trajectory types for specific gait events and activities. Advances in deep learning enable walking assistive devices to better adapt to varied terrains and diverse users by learning movement patterns from gait data. In this study, a temporal deep learning model with a self-attention mechanism is developed to continuously generate lower limb joint angle trajectories for the ankle and knee across various activities. Additional analyses, including the Fast Fourier Transform and paired t-tests, are conducted to demonstrate the benefits of the proposed attention model architecture over existing methods. Transfer learning is also performed to demonstrate the importance of data diversity. Under a 10-fold leave-one-out testing scheme, the attention model achieves errors of 11.50% (±2.37%) and 9.31% (±1.56%) NRMSE for ankle and knee angle estimation, respectively, which are small in comparison to other studies. Statistical analysis using the paired t-test reveals that the proposed attention model is superior to the baseline model in terms of reduced prediction error. The attention model also produces smoother outputs, which is crucial for safety and comfort. Transfer learning is shown to effectively reduce model errors and noise, underscoring the importance of including diverse datasets. The proposed joint angle trajectory generator has the potential to switch seamlessly between different locomotion tasks, thereby mitigating the problem of detecting activity transitions encountered by the traditional finite-state strategy. This data-driven trajectory generation method can also reduce the burden of personalization, as traditional devices rely on prosthetists to experimentally tune many parameters for individuals with diverse gait patterns.
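The two evaluation tools named in the abstract, NRMSE for per-joint estimation error and a paired t-test for comparing the attention model against the baseline across folds, can be sketched as below. This is a minimal illustration, not the paper's code; in particular, the range-based normalization of the RMSE is an assumption (papers sometimes normalize by the mean or standard deviation of the true signal instead), and the fold-error values are hypothetical.

```python
import math

def nrmse(y_true, y_pred):
    """RMSE normalized by the range of the true signal (assumed
    normalization; the paper may use a different denominator)."""
    n = len(y_true)
    rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    return rmse / (max(y_true) - min(y_true))

def paired_t_statistic(errors_a, errors_b):
    """t-statistic of a paired t-test on matched per-fold errors of two
    models (e.g. attention vs. baseline under 10-fold leave-one-out)."""
    diffs = [a - b for a, b in zip(errors_a, errors_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical per-fold NRMSE values for two models on the same folds:
baseline_errors = [0.130, 0.125, 0.118, 0.122, 0.127]
attention_errors = [0.115, 0.112, 0.110, 0.113, 0.116]
t = paired_t_statistic(baseline_errors, attention_errors)
# A large positive t here would indicate the attention model's error
# is consistently lower across folds.
```

Comparing paired per-fold errors (rather than pooling all samples) is what makes the t-test appropriate here: each fold yields one matched pair, so subject-level variability cancels in the differences.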
