Abstract
Physical performance tests such as the 30-second Sit-to-Stand (30s-STS), Timed Up and Go (TUG), and Short Physical Performance Battery (SPPB) are widely used to assess physical function in older adults and are predictive of key health outcomes. However, their routine use in clinical practice is limited by time, resource, and personnel constraints. This study aimed to validate automated scoring of physical performance assessments by a mobile, markerless motion capture (MMC) app against scoring by a certified exercise physiologist (CEP), and to quantify the rate of and reasons for technology-related data loss. A total of 228 adults (mean age = 61.6 ± 11.9 years) with at least one chronic medical condition were enrolled. Participants completed seven performance assessments: the 30s-STS, TUG, and all components of the SPPB (Side-by-Side, Semi-Tandem, and Tandem balance stands; 5-times Sit-to-Stand (5xSTS); and Gait Speed). All tests were scored simultaneously by a CEP and by the MMC app running on a Light Detection and Ranging (LiDAR)-enabled iPad. Agreement was assessed using intraclass correlation coefficients (ICCs) and weighted Cohen's kappa. Agreement between the MMC app and the CEP was good to excellent for all assessments: ICCs ranged from 0.812 (Tandem Stand) to 0.995 (5xSTS), and the overall SPPB score showed almost perfect agreement (κ = 0.808). Perfect agreement with no variability was observed for the Side-by-Side and Semi-Tandem balance tests. The overall technology-related data loss rate was low (3.1%), with the most common issue being poor motion-tracking quality (1.3%). Automated scoring of physical performance tests using a self-contained MMC app demonstrated high agreement with expert scoring and low data loss in a cohort of participants with a range of chronic medical conditions.
These findings support the use of MMC-enabled mobile applications for scalable, accessible, and objective assessment of physical function in clinical settings, with future potential for remote and asynchronous use.
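The abstract reports weighted Cohen's kappa for agreement on the ordinal SPPB score. As an illustrative sketch only (not the study's analysis code, and with hypothetical rater data), a quadratic-weighted kappa for two raters over ordinal categories can be computed as:

```python
def quadratic_weighted_kappa(rater_a, rater_b, categories):
    """Quadratic-weighted Cohen's kappa for two raters over ordinal categories.

    Disagreements are penalized by ((i - j) / (k - 1))^2, where i and j are
    the category indices assigned by each rater and k is the category count.
    """
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)

    # Observed joint frequency table (rows: rater A, columns: rater B).
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[index[a]][index[b]] += 1

    # Marginal totals, used to build the chance-expected table.
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Quadratic disagreement weights (0 on the diagonal, 1 at maximum distance).
    w = [[((i - j) / (k - 1)) ** 2 for j in range(k)] for i in range(k)]

    observed = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w[i][j] * row[i] * col[j] / n
                   for i in range(k) for j in range(k))
    return 1.0 - observed / expected


# Hypothetical example: two raters scoring six trials on a 3-level scale.
a = [0, 0, 1, 1, 2, 2]
b = [0, 1, 1, 1, 2, 1]
kappa = quadratic_weighted_kappa(a, b, categories=[0, 1, 2])
```

Identical ratings yield κ = 1; values near 0 indicate chance-level agreement. The 13-point SPPB score would use the same computation with a longer category list.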