Abstract
Traffic flow prediction is challenging due to complex spatio-temporal correlations and the graph-structured nature of traffic networks. Adaptive graph construction methods have gained attention for their superiority over static graph models. However, most existing methods adjust the graph structure only during training and therefore cannot reflect real-time dynamics at test time. This limitation is particularly significant in traffic flow prediction, where time series are often affected by abrupt changes and irregularities. To address this, a traffic flow prediction model named Progressive Graph Convolutional Networks with Subseries Transformer (PGSFormer) is proposed, which jointly optimizes a progressive graph convolutional network and a subseries Transformer. PGSFormer constructs a progressive adjacency matrix by learning trend similarity between nodes, integrates it with dilated causal convolution and gated recurrent units to extract temporal features, and uses parameterized cosine similarity to dynamically update edge weights in real time. In addition, the Transformer is enhanced with a mask reconstruction task to generate context-aware subseries representations, and stacked 1D convolutional layers extract long-term trends. Experiments on two real-world datasets show that PGSFormer significantly outperforms existing baselines in prediction accuracy.
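The abstract's key mechanism for test-time adaptivity is a parameterized cosine similarity that re-weights graph edges from the current input window. The following is a minimal sketch of that idea, not the paper's exact formulation: the learnable projection `W`, the normalization constant, and the row-wise softmax are all assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn


class ParamCosineAdjacency(nn.Module):
    """Hypothetical sketch of parameterized cosine similarity between nodes.

    A learnable linear map (assumed here; the paper's parameterization may
    differ) projects each node's recent feature window, and the cosine
    similarity of the projections gives edge weights that can be recomputed
    from every incoming window, including at test time.
    """

    def __init__(self, in_dim: int, proj_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, proj_dim, bias=False)  # learnable W

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, in_dim) features from the current time window
        z = self.proj(h)                               # (N, proj_dim)
        z = z / (z.norm(dim=-1, keepdim=True) + 1e-8)  # unit-normalize rows
        sim = z @ z.t()                                # pairwise cosine sim
        # Row-normalize so each node's outgoing weights sum to 1
        # (softmax normalization is an assumption of this sketch).
        return torch.softmax(sim, dim=-1)


# Usage: recompute the adjacency from each new window of 207 sensors,
# each described by a 12-step feature vector (sizes are illustrative).
adj = ParamCosineAdjacency(in_dim=12, proj_dim=8)(torch.randn(207, 12))
```

Because the similarity is a function of the input rather than a fixed learned matrix, the edge weights change with each window, which is what allows the graph to track abrupt shifts during inference.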