Deep Learning: Predicting Environments From Short-Time Observations of Postural Balance


Abstract

OBJECTIVE: This study introduces a deep learning approach for accurately predicting challenging mechanical environments that may reduce postural stability. METHODS: Dual-axis robotic platforms were used to simulate various environments and to collect center-of-pressure data during narrow and wide stance. A convolutional neural network (CNN) was developed to predict environmental conditions from segmented time-series balance data. Different window sizes were examined to determine the minimal data length required for reliable prediction. The effectiveness of the proposed CNN was also compared with that of conventional machine learning models, and its applicability to lower-sampled data and to more natural stance data was evaluated. RESULTS: The CNN achieved over 94.5% overall prediction accuracy even with 2.5-second postural sway data, a level that conventional machine learning models could not reach (ps < 0.05). Increasing the data length beyond 2.5 seconds slightly improved CNN accuracy but substantially increased training time (by 60%). Importantly, averaged normalized confusion matrices revealed that the CNN was far more capable of differentiating the mid-level environmental condition. The CNN also produced comparable performance with much lower-sampled data and with altered standing posture. CONCLUSION: The CNN removed the burden of feature preparation and accurately predicted environments from short-length data, indicating its potential for real-life applications. SIGNIFICANCE: This study contributes to the advancement of wearable devices and human-interactive robots (e.g., exoskeletons and prostheses) by predicting environmental contexts and preventing potential falls.
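The windowing step described in the methods can be illustrated with a short sketch. The function below splits a multichannel center-of-pressure recording into fixed-length segments of the kind a CNN would classify; the 100 Hz sampling rate, channel layout, and function name are illustrative assumptions, not details reported in the abstract (which specifies only the 2.5-second window length).

```python
import numpy as np

def segment_windows(signal, fs, window_s, step_s=None):
    """Split a center-of-pressure time series into fixed-length windows.

    signal:   (T, C) array, e.g. anterior-posterior and medial-lateral sway
    fs:       sampling rate in Hz (hypothetical; not stated in the abstract)
    window_s: window length in seconds (the study reports 2.5 s suffices)
    step_s:   hop between windows; defaults to non-overlapping windows
    """
    win = int(round(window_s * fs))
    step = win if step_s is None else int(round(step_s * fs))
    starts = range(0, signal.shape[0] - win + 1, step)
    # Stack each window into a (num_windows, win, C) batch for the classifier
    return np.stack([signal[s:s + win] for s in starts])

# Example: 30 s of two-channel CoP data at an assumed 100 Hz
fs = 100
x = np.random.randn(30 * fs, 2)
windows = segment_windows(x, fs=fs, window_s=2.5)
print(windows.shape)  # (12, 250, 2): twelve non-overlapping 2.5 s windows
```

Non-overlapping windows keep the segments statistically independent for evaluation; a smaller `step_s` would instead yield overlapping windows, a common way to augment training data.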
