Abstract
Distributed Acoustic Sensing (DAS) has emerged as a promising tool for real-time traffic monitoring in densely populated areas. In this paper, we present an approach that integrates DAS data with co-located, calibrated video recordings. We use YOLO-derived vehicle locations and classifications from the video as labeled data to train a detection and classification neural network that operates on DAS data alone. The model is then applied in areas both with and without video coverage. Evaluated against YOLO outputs, it achieves a detection and classification success rate of about [Formula: see text] and a false-alarm rate of about [Formula: see text]. We illustrate the model's application by monitoring a week of traffic, yielding statistical insights that could benefit future smart-city development. Our approach highlights the potential of combining fiber-optic sensors with cameras, emphasizing practicality, scalability, privacy protection, and low infrastructure cost. To encourage future research, we share our datasets.