Lightweight and efficient skeleton-based sports activity recognition with ASTM-Net


Abstract

Human Activity Recognition (HAR) plays a pivotal role in video understanding, with applications ranging from surveillance to virtual reality. Skeletal data has emerged as a robust modality for HAR, overcoming challenges such as noisy backgrounds and lighting variations. However, current Graph Convolutional Neural Network (GCNN)-based methods for skeletal activity recognition face two key limitations: (1) they fail to capture the dynamic changes in node affinities induced by movements, and (2) they overlook the interplay between spatial and temporal information that is critical for recognizing complex actions. To address these challenges, we propose ASTM-Net, an Activity-aware SpatioTemporal Multi-branch graph convolutional network comprising two novel modules. First, the Activity-aware Spatial Graph convolution Module (ASGM) dynamically models Activity-Aware Adjacency Graphs (3A-Graphs) by fusing a manually initialized physical graph, a learnable graph optimized end-to-end, and a dynamically inferred, activity-related graph, thereby capturing evolving spatial affinities. Second, we introduce the Temporal Multi-branch Graph convolution Module (TMGM), which employs parallel branches of channel reduction, dilated temporal convolutions with varied dilation rates, pooling, and pointwise convolutions to model both fine-grained and long-range temporal dependencies. This multi-branch design not only accommodates diverse action speeds and durations but also maintains parameter efficiency. By integrating ASGM and TMGM, ASTM-Net jointly captures spatial-temporal interactions at significantly reduced computational cost.
Extensive experiments on NTU-RGB+D, NTU-RGB+D 120, and Toyota Smarthome demonstrate ASTM-Net's superiority: it outperforms DualHead-Net-ALLs by 0.31% on NTU-RGB+D X-Sub and surpasses SkateFormer by 2.22% on Toyota Smarthome Cross-Subject; it reduces parameters by 51.9% and FLOPs by 49.7% compared with MST-GCNN-ALLs while improving accuracy by 0.82%; and under 30% random node occlusion it achieves 86.94% accuracy, 3.49% higher than CBAM-STGCN.
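The two modules described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the fusion of the 3A-Graph follows the common adaptive-graph style (embedded-feature similarity via `W_theta`/`W_phi`, which are illustrative names), and the TMGM branches use fixed example kernels and dilation rates; the paper's exact normalization, kernel sizes, and branch count may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# ---- ASGM: fuse the three graphs of the 3A-Graph (illustrative) ----------
def fused_adjacency(x, A_phys, A_learn, W_theta, W_phi):
    """x: (N, C) joint features for one frame; returns an (N, N) adjacency.

    A_phys  -- fixed physical-skeleton graph (manually initialized)
    A_learn -- freely learnable graph (optimized end-to-end in training)
    The activity-related graph is inferred from embedded-feature similarity,
    row-normalized so each joint's affinities sum to 1.
    """
    A_act = softmax((x @ W_theta) @ (x @ W_phi).T, axis=-1)
    return A_phys + A_learn + A_act

# ---- TMGM: parallel temporal branches (illustrative) ---------------------
def dilated_conv1d(x, kernel, d):
    """'Same'-padded, per-channel dilated temporal convolution. x: (T, C)."""
    T, _ = x.shape
    pad = d * (len(kernel) - 1) // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros_like(x)
    for i, k in enumerate(kernel):
        out += k * xp[i * d : i * d + T]
    return out

def temporal_maxpool(x, k=3):
    """'Same'-padded temporal max pooling over a window of k frames."""
    T, _ = x.shape
    pad = (k - 1) // 2
    xp = np.pad(x, ((pad, pad), (0, 0)), constant_values=-np.inf)
    return np.stack([xp[i : i + T] for i in range(k)]).max(axis=0)

def tmgm(x, W_reduce, W_point):
    """x: (T, C). Channel reduction, then parallel branches, then concat."""
    xr = np.maximum(x @ W_reduce, 0.0)           # pointwise conv + ReLU
    kernel = np.array([0.25, 0.5, 0.25])         # illustrative fixed taps
    branches = [dilated_conv1d(xr, kernel, d)    # varied dilation rates
                for d in (1, 2)]
    branches.append(temporal_maxpool(xr))        # pooling branch
    branches.append(xr @ W_point)                # pointwise branch
    return np.concatenate(branches, axis=1)

rng = np.random.default_rng(0)
N, C, E, T = 25, 64, 16, 30                      # 25 joints as in NTU-RGB+D
x_frame = rng.standard_normal((N, C))
A = fused_adjacency(x_frame,
                    (rng.random((N, N)) < 0.1).astype(float),
                    0.01 * rng.standard_normal((N, N)),
                    rng.standard_normal((C, E)),
                    rng.standard_normal((C, E)))
print(A.shape)                                   # (25, 25)

x_time = rng.standard_normal((T, C))
Wr = rng.standard_normal((C, C // 4))            # reduce 64 -> 16 channels
Wp = rng.standard_normal((C // 4, C // 4))
print(tmgm(x_time, Wr, Wp).shape)                # (30, 64): 4 branches x 16
```

Note how the channel reduction keeps the multi-branch design parameter-efficient: each branch operates on C/4 channels, and concatenating the four branches restores the original width.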
