A Method for Pedestrian Trajectory Prediction Using INS-GNSS Wearable Devices


Abstract

Driven by advances in artificial intelligence, pedestrian trajectory prediction is shifting from traditional machine learning methods toward autonomous decision-making frameworks based on neural networks. However, the spatiotemporal uncertainty of pedestrian movement limits the accuracy of existing prediction models. To address this issue, we propose a multi-source perception fusion system based on INS-GNSS wearable devices, integrating high-precision inertial measurement units (IMUs) and multi-mode global navigation satellite system (GNSS) receivers to improve both localization and prediction accuracy. For localization, we introduce a gait-adaptive unscented Kalman filter (Gait-AUKF) that identifies pedestrian gait patterns and motion states by fusing multi-sensor data; its adaptive algorithm effectively suppresses trajectory drift and improves tracking accuracy. For trajectory prediction, we propose a framework based on a multi-source fusion attention mechanism: a GRU encoder extracts trajectory features from historical motion data, an attention mechanism assigns varying weights to features at different scales, and an LSTM decoder combined with an A* path-planning algorithm applies spatiotemporal path constraints to generate future pedestrian trajectories. Experimental results show that, compared with the UKF and AKF, the Gait-AUKF reduces eastward error by 30%, northward error by 26.27%, and vertical error by 49.08%. The complete prediction framework achieves a 68.54% reduction in average position error (APE) and a 70.42% reduction in direction error (DE) compared with LSTM and Transformer models. Ablation experiments confirm that the integrated Gait-AUKF and A* path-planning algorithms enhance the model's decision performance: after incorporating them, the model's ADE decreased by 68.49% and FDE by 71.86%.
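The encoder-attention-decoder pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration in PyTorch, not the authors' implementation: layer sizes, the single-layer attention scorer, and the autoregressive decoding loop are all assumptions, and the A* path constraint and Gait-AUKF preprocessing are omitted.

```python
import torch
import torch.nn as nn

class TrajectoryPredictor(nn.Module):
    """Illustrative GRU-encoder / attention / LSTM-decoder sketch.
    Dimensions and structure are assumed, not taken from the paper."""

    def __init__(self, input_dim=2, hidden_dim=32, pred_len=12):
        super().__init__()
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)      # one score per time step
        self.decoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, input_dim)
        self.pred_len = pred_len

    def forward(self, hist):                      # hist: (B, T, 2) positions
        enc_out, _ = self.encoder(hist)           # (B, T, H)
        w = torch.softmax(self.attn(enc_out), dim=1)   # attention weights
        ctx = (w * enc_out).sum(dim=1)            # (B, H) weighted context
        h = ctx.unsqueeze(0)                      # init decoder hidden state
        c = torch.zeros_like(h)
        step = hist[:, -1:, :]                    # start from last observation
        preds = []
        for _ in range(self.pred_len):            # autoregressive roll-out
            dec_out, (h, c) = self.decoder(step, (h, c))
            step = self.out(dec_out)              # predict next (x, y)
            preds.append(step)
        return torch.cat(preds, dim=1)            # (B, pred_len, 2)

model = TrajectoryPredictor()
future = model(torch.randn(4, 8, 2))              # 8 observed steps, batch of 4
print(future.shape)                               # torch.Size([4, 12, 2])
```

In a full system, the decoder's raw outputs would then be constrained by the A* planner so that predicted points stay on feasible paths, as the abstract describes.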
