Abstract
Traditional physiological monitoring methods in basketball are often invasive and struggle to capture long-term fluctuations in movement quality. This study presents a framework for assessing movement-quality degradation based on video-based virtual sensing and an Adaptive Graph Convolutional Network (AGCN)-Mamba architecture. The framework first extracts human skeleton sequences via pose estimation and transforms them into dynamic feature tensors resembling Inertial Measurement Unit (IMU) data, thereby simulating the data stream of a wearable device. The AGCN module then captures coordination relationships between non-adjacent limbs, while Mamba's selective scanning strategy enables long-term fatigue tracking throughout a game at low computational cost. Under a subject-disjoint evaluation protocol, the framework achieved a Top-1 action recognition accuracy of 94.82%, and its movement-quality scores correlated strongly with expert evaluations (correlation coefficient r = 0.91). A per-frame latency of 12.08 ms further supports the framework's applicability to non-invasive, real-time movement monitoring. These results show that combining adaptive graph learning with linear state-space modeling enables both effective deep-feature extraction and efficient edge computation. The proposed method thus provides a solid theoretical and technical foundation for next-generation low-power, high-precision intelligent wearable fitness monitoring systems.