Abstract
Stroke is a leading cause of long-term disability, often affecting upper-limb motor function and requiring continuous assessment. The Fugl-Meyer Assessment (FMA), though a clinical gold standard, is time-consuming and demands specialized personnel. This study presents an AI-driven, low-cost rehabilitation exergame that simultaneously provides therapy and automatically estimates upper-limb motor performance during gameplay using only a standard camera. Sixteen kinematic and spatiotemporal features were extracted from 2D hand and arm trajectories of twelve post-stroke individuals (24 limbs, 14 affected) using the MediaPipe framework. Features such as hand angle, range of motion, movement area, traveled distance, and shoulder-elbow coordination showed strong correlations with FMA scores and stratified participants by motor severity. A lightweight linear regression model achieved high predictive performance (Spearman ρ = 0.92, R² = 0.89, RMSE = 4.42) and classified severity levels with 86–93% accuracy. This interpretable approach outperformed more complex machine learning models, highlighting the clinical relevance of transparent metrics embedded in gameplay. The proposed framework is sensor-free, scalable, and reproducible, offering immediate feedback while reducing clinical workload and enabling accessible digital biomarkers for telerehabilitation and remote monitoring after stroke.
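The pipeline summarized above (kinematic features regressed onto FMA scores, then binned into severity levels) can be illustrated with a minimal sketch. All numbers, feature values, and severity cutoffs below are illustrative assumptions for exposition only, not the study's data or thresholds; the study used sixteen features, whereas this sketch fits a single hypothetical range-of-motion feature with ordinary least squares.

```python
# Hedged sketch: least-squares regression mapping one illustrative
# kinematic feature (a normalized range-of-motion value) to an FMA
# score, followed by a coarse severity classification.
# Data points and severity cutoffs are assumptions, not study results.

def fit_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error for 1D data."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

def severity_band(fma):
    """Map a predicted FMA upper-extremity score (0-66) to a coarse band.
    Cutoffs here are illustrative placeholders, not clinical thresholds."""
    if fma < 29:
        return "severe"
    if fma < 43:
        return "moderate"
    return "mild"

# Illustrative feature/score pairs (hypothetical, not from the study)
rom = [0.2, 0.4, 0.5, 0.7, 0.9]
fma = [12, 30, 38, 52, 64]

slope, intercept = fit_ols(rom, fma)
pred = slope * 0.6 + intercept  # predict FMA for an unseen feature value
band = severity_band(pred)
```

In practice the study's model regresses all sixteen features jointly; the single-feature form is used here only to keep the example self-contained.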