Abstract
Road agglomerate fog seriously threatens driving safety, making real-time fog detection crucial for implementing reliable traffic control measures. With their aerial perspective and broad field of view, unmanned aerial vehicles (UAVs) have emerged as a novel platform for monitoring road agglomerate fog. This paper proposes an agglomerate fog detection method based on the fusion of Speeded-Up Robust Features (SURF) and optical flow characteristics. To synthesize an adequate agglomerate fog sample set from a limited number of field-collected fog images, we present FogGAN, a novel generative network that injects physical cues into its generator. Taking the region of interest (ROI) for agglomerate fog detection in a UAV image as the basic unit, SURF describes static texture features while optical flow captures frame-to-frame motion, and the two cues are fused with a Bayesian multi-feature approach. Experimental results demonstrate that FogGAN generates a more realistic dataset of agglomerate fog sample images, and that the proposed SURF and optical flow fusion method achieves higher precision, recall, and F1-score on UAV perspective images than XGBoost-based and survey-informed fusion methods.
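To make the fusion idea concrete, the Bayesian combination of the two per-ROI cues can be sketched as below. This is a minimal illustration only, not the paper's implementation: the function name, the prior, and all likelihood values are hypothetical placeholders, and the sketch assumes the SURF texture cue and the optical-flow motion cue are conditionally independent given the fog state.

```python
# Hypothetical sketch of Bayesian fusion of two per-ROI fog cues:
# a SURF texture score and an optical-flow motion score.
# All numeric values are illustrative placeholders, not results from the paper.

def bayes_fuse(p_fog_prior, lik_surf, lik_flow):
    """Fuse two conditionally independent cues for one ROI.

    lik_surf / lik_flow: tuples (P(cue | fog), P(cue | clear)).
    Returns the posterior probability P(fog | both cues).
    """
    p_clear_prior = 1.0 - p_fog_prior
    num = p_fog_prior * lik_surf[0] * lik_flow[0]           # evidence for fog
    den = num + p_clear_prior * lik_surf[1] * lik_flow[1]   # total evidence
    return num / den

# Example ROI: weak texture (fog suppresses SURF keypoints) and low flow magnitude
posterior = bayes_fuse(0.3, lik_surf=(0.8, 0.2), lik_flow=(0.7, 0.4))
print(round(posterior, 3))  # → 0.75
```

Even with a modest prior of 0.3, two cues that individually favor fog push the posterior to 0.75, which is the practical motivation for fusing static texture with motion evidence rather than thresholding either cue alone.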