Beyond the joystick: deep learning games for hand movement recovery


Abstract

INTRODUCTION: This research explores the use of Deep Learning (DL) techniques for hand and gesture recognition to support hand rehabilitation programs. The primary objective is to enhance cognitive function and hand-eye coordination through gamified therapeutic exercises that track and respond to hand gestures in real time.

METHODS: Pre-trained Convolutional Neural Network (CNN) models were employed for hand recognition using Google's open-source MediaPipe library. Four classic arcade-style games (Pong, Tetris, Fruit Ninja, and a Virtual Keyboard) were redeveloped as gesture-controlled rehabilitation tools within a web-based interface built on the Phaser.js framework. A score-based system was implemented to track user performance and progress. Usability was evaluated with the System Usability Scale (SUS), and statistical validation was performed with a one-sample t-test against the industry benchmark.

RESULTS: Data collected from 15 participants demonstrated consistent gesture-recognition accuracy and stable game-control performance. The SUS evaluation indicated favorable user responses, with usability scores exceeding the benchmark threshold, suggesting that participants found the system intuitive and engaging.

DISCUSSION: The study confirms the feasibility of monocular camera-based computer vision for hand rehabilitation. Compared with existing low-cost rehabilitation tools, the proposed system provides an accessible, interactive, and affordable alternative. Integrating gesture recognition with gamified interfaces effectively supports dexterity recovery and motivates users through engaging gameplay. These findings establish a foundation for future development of AI-based rehabilitation platforms using standard camera devices.
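The abstract describes driving game controls from camera-tracked hand gestures. As a minimal sketch of what such a mapping might look like, the following assumes MediaPipe-style output (21 hand landmarks with coordinates normalized to [0, 1], where index 8 is the index fingertip) and a hypothetical Pong paddle moving along one axis; the smoothing factor and field width are illustrative assumptions, not details taken from the paper.

```python
# Sketch: map a normalized hand-landmark x-coordinate to a Pong paddle
# position, with exponential smoothing to damp camera jitter.
# Assumes MediaPipe-style landmarks: 21 (x, y) pairs normalized to [0, 1];
# index 8 is the index fingertip in MediaPipe's hand model.

INDEX_FINGER_TIP = 8

def paddle_position(landmarks, prev_pos, field_width=800, alpha=0.3):
    """Return a smoothed paddle x-position in pixels.

    landmarks:   list of (x, y) tuples normalized to [0, 1].
    prev_pos:    previous paddle position in pixels, used for smoothing.
    field_width: width of the playing field in pixels (illustrative).
    alpha:       smoothing factor; higher = more responsive but jitterier.
    """
    raw = landmarks[INDEX_FINGER_TIP][0] * field_width
    # Exponential moving average: blend the new reading with the last position.
    smoothed = alpha * raw + (1 - alpha) * prev_pos
    # Clamp to the playing field.
    return max(0.0, min(float(field_width), smoothed))

# Example: fingertip at the horizontal centre of the frame.
landmarks = [(0.0, 0.0)] * 21
landmarks[INDEX_FINGER_TIP] = (0.5, 0.4)
pos = paddle_position(landmarks, prev_pos=400.0)
```

In a real pipeline the landmark list would come from MediaPipe's per-frame hand detection, and the smoothed position would be fed to the Phaser.js game loop; the smoothing step matters because raw monocular landmark estimates fluctuate frame to frame.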
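The abstract reports SUS usability scoring validated with a one-sample t-test against an industry benchmark. A minimal sketch of that analysis, assuming the standard SUS scoring rule (odd items contribute response − 1, even items contribute 5 − response, summed and scaled by 2.5) and the commonly cited benchmark mean of 68; the response data below are fabricated placeholders for illustration, not the study's 15-participant data.

```python
import math
import statistics

def sus_score(responses):
    """Compute a SUS score (0-100) from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum of the ten
    contributions is scaled by 2.5.
    """
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

def one_sample_t(scores, popmean):
    """t statistic for H0: mean(scores) == popmean, with n - 1 degrees
    of freedom (sample standard deviation)."""
    n = len(scores)
    m = statistics.mean(scores)
    s = statistics.stdev(scores)
    return (m - popmean) / (s / math.sqrt(n))

# Placeholder responses for illustration only (not the study's data).
participants = [
    [5, 2, 4, 1, 5, 2, 4, 2, 5, 1],
    [4, 2, 5, 2, 4, 1, 4, 2, 4, 2],
    [5, 1, 4, 2, 5, 2, 5, 1, 4, 2],
]
scores = [sus_score(r) for r in participants]
t_stat = one_sample_t(scores, popmean=68.0)
```

The resulting t statistic would be compared against the t distribution with n − 1 degrees of freedom to obtain a p-value (in practice via a statistics library such as scipy's `ttest_1samp`); a significantly positive t indicates mean usability above the benchmark, which is the comparison the abstract describes.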
