Abstract
This study aims to assess hand-function recovery in stroke patients during the mid-to-late Brunnstrom stages and to encourage active participation in rehabilitation exercises. To this end, a deep residual network (ResNet) combined with Focal Loss is employed for gesture recognition, achieving a Macro F1 score of 91.0% and a validation accuracy of 90.9%. Leveraging the millimetre-level precision of Leap Motion 2 hand tracking, a mapping of hand skeletal joint points was established, and a static assessment-gesture dataset of 502,401 frames was collected, with gestures selected through analysis of the Fugl-Meyer Assessment (FMA) scale. The system provides immersive augmented-reality interaction built on the Unity development platform, with C# algorithms designed for real-time quantification of range of motion. Finally, a rehabilitation system framework tailored to home and community environments is designed, covering system module workflows, assessment modules, and game logic. Experimental results demonstrate the technical feasibility and high accuracy of the automated assessment and rehabilitation training system. Deployed in home and community settings, the system has the potential to enhance patients' rehabilitation motivation, interactivity, and self-efficacy. This work presents an integrated research framework encompassing hand modelling and deep-learning-based recognition, offering a feasible and economical solution for stroke survivors and laying the foundation for future clinical applications.