Federated spatial-temporal traffic forecasting with VMD-enhanced graph attention and LSTM


Abstract

Accurate spatiotemporal demand forecasting in distributed environments is challenging because of data heterogeneity, non-stationarity across clients, and privacy constraints. Traditional federated learning approaches often perform poorly when global model updates do not align with local data distributions. This study proposes a novel VMD-enhanced LSTM-DSTGCRN framework with Graph Attention Networks (GATs) and a Client-Side Validation (CSV) mechanism to address these challenges. Variational Mode Decomposition (VMD) is applied locally to decompose raw broadband demand signals into Intrinsic Mode Functions (IMFs), isolating characteristic frequency components and reducing cross-frequency interference. The decomposed signals are processed by an LSTM-MultiHead Attention-AGCRN backbone that jointly captures temporal dependencies and, through GATs, adaptive spatial correlations. In the federated setting, CSV enables selective, module-level integration of aggregated global parameters, allowing clients to retain locally optimal parameters while adopting beneficial global updates. Experiments on multimodal transport demand datasets show that the proposed approach achieves higher prediction accuracy, faster convergence, and improved robustness than baseline federated graph learning models, providing an effective, privacy-preserving solution for non-stationary, heterogeneous spatiotemporal forecasting tasks. In the centralized setting, the proposed model reduces MAE by 28% relative to the baseline; in the federated learning setup, MAE decreases by 40.6% and RMSE by 20.1%.
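The module-level selective integration that CSV performs can be illustrated with a minimal sketch: after aggregation, a client adopts a global module's parameters only when they do not increase the loss on a held-out local validation set. This is an assumption-laden illustration of the idea described in the abstract, not the paper's implementation; the function names (`client_side_validation`, `toy_loss`) and the scalar "parameters" are hypothetical.

```python
# Hedged sketch of Client-Side Validation (CSV): per-module acceptance of
# aggregated global parameters based on a client's local validation loss.
# All names and the toy loss are illustrative, not taken from the paper.

def client_side_validation(local_params, global_params, evaluate):
    """Merge global parameters module by module, adopting a global module
    only if it does not worsen the client's local validation loss."""
    merged = dict(local_params)
    baseline = evaluate(merged)
    for module, g_params in global_params.items():
        candidate = dict(merged)
        candidate[module] = g_params           # try the global update
        score = evaluate(candidate)
        if score <= baseline:                  # beneficial (or neutral): keep it
            merged, baseline = candidate, score
    return merged

# Toy usage: the "loss" is the squared distance of each module's scalar
# parameter from a client-specific optimum. Only the global module that
# actually helps this client (here "gcn") should be adopted.
local_opt = {"lstm": 1.0, "attention": 2.0, "gcn": 3.0}

def toy_loss(params):
    return sum((params[m] - local_opt[m]) ** 2 for m in params)

local_p  = {"lstm": 1.1, "attention": 2.0, "gcn": 5.0}   # gcn is poor locally
global_p = {"lstm": 0.0, "attention": 9.0, "gcn": 3.0}   # gcn is good globally

merged = client_side_validation(local_p, global_p, toy_loss)
# merged keeps the local lstm/attention modules and adopts the global gcn
```

The design choice mirrored here is that acceptance is decided per module rather than for the whole model, so a client under a heterogeneous data distribution can discard a harmful global update for one component while still benefiting from the rest.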
