Abstract
Multivariate time series anomaly detection is a critical task in modern engineering, with applications spanning environmental monitoring, network security, and industrial systems. While reconstruction-based methods have shown promise, they often overfit and fail to adequately distinguish normal from anomalous data, which limits their ability to generalize. To address these challenges, we propose AOST, a model that integrates adversarial learning with an outlier suppression mechanism within a Transformer framework. The model introduces an outlier suppression attention mechanism that sharpens the distinction between normal and anomalous data points, improving sensitivity to deviations. In addition, a dual-decoder generative adversarial architecture enforces consistent learning of the data distribution, enhancing robustness and generalization. A novel anomaly scoring strategy based on longitudinal differences further refines detection accuracy. Extensive experiments on four public datasets (SWaT, WADI, SMAP, and PSM) demonstrate the model's superior performance, achieving an average F1 score of 88.74% and surpassing existing state-of-the-art methods. These results underscore the effectiveness of AOST in advancing multivariate time series anomaly detection.