Abstract
Inverter overheating is a critical cause of faults in rail transit systems. To address the challenges of sparse low-voltage data and high-dimensional input features, we propose a hybrid prediction framework for inverter temperature. The Random Masked Dual DCGAN (RTDG) model is introduced to enhance the diversity of low-voltage data, while a Gaussian Markov Random Field (GMRF) method performs dimensionality reduction by identifying key variables. To capture spatio-temporal dependencies, an enhanced Transformer architecture (STTr) is constructed, integrating state space modeling and temporal normalization. These components are fused using a weighted stacking strategy. The model is trained and validated on real-world rail transit datasets, and performance is evaluated using mean squared error (MSE), root mean squared error (RMSE), and mean absolute error (MAE). Experimental results show that the proposed model outperforms conventional approaches, achieving a 4.93% improvement over single models and a 9.73% gain compared to training without data augmentation. This framework supports intelligent fault prevention and contributes to the safe, efficient operation of modern rail systems.
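As a minimal illustration of the evaluation metrics and the weighted-stacking fusion mentioned above, the sketch below combines two hypothetical base-model temperature predictions with fixed weights and scores the result with MSE, RMSE, and MAE. The arrays, weights, and function names are placeholders for exposition, not the paper's actual models or data.

```python
import math

def mse(y_true, y_pred):
    # Mean squared error over paired observations.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # Root mean squared error: square root of MSE.
    return math.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    # Mean absolute error over paired observations.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_stack(pred_a, pred_b, w_a):
    # Hypothetical weighted-stacking fusion of two base predictors;
    # weights sum to 1 so the ensemble stays on the same scale.
    return [w_a * a + (1.0 - w_a) * b for a, b in zip(pred_a, pred_b)]

# Placeholder inverter temperatures (degrees C) and two base-model outputs.
y_true = [60.0, 62.5, 65.0, 63.0]
pred_1 = [61.0, 62.0, 64.0, 64.5]   # e.g. a GAN-augmented model's output
pred_2 = [59.0, 63.0, 66.0, 62.0]   # e.g. a spatio-temporal model's output

fused = weighted_stack(pred_1, pred_2, w_a=0.6)
print("MSE :", mse(y_true, fused))
print("RMSE:", rmse(y_true, fused))
print("MAE :", mae(y_true, fused))
```

In practice the stacking weights would be learned on a validation split rather than fixed by hand; the point here is only that the fused prediction is scored with the same three metrics reported in the abstract.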