Abstract
This paper presents a comparative study on detecting activities of daily living (ADLs) using two distinct sensor systems: the FlexTail wearable spine tracker and a camera-based pose estimation model. We developed a protocol to record data with both systems simultaneously, covering eleven activities spanning general movement, household tasks, and food handling. We evaluated a comprehensive selection of state-of-the-art time series classification algorithms. Both systems achieved high classification performance: with a 1-second time window, both datasets reached an average F1 score of 0.90, using the random dilated shapelet transform (RDST) classifier for the FlexTail data and the QUANT classifier for the camera data. We also explored the impact of hierarchical activity grouping and found that, while it improved classification performance in some cases, the benefits were not consistent across all activities. Our findings suggest that both sensor systems can reliably recognize ADLs. The FlexTail model performs better at detecting sitting and transitions, such as standing up, while the camera-based model is better suited to activities involving arm and hand movements.