Abstract
Exposure to Particulate Matter (PM) is linked to respiratory and cardiovascular diseases and certain types of cancer, and is associated with approximately seven million premature deaths globally each year. While governments and organizations have implemented various Air Quality (AQ) management strategies, such as the deployment of Air Quality Monitoring Networks (AQMN), these networks often suffer from limited spatial coverage and entail high installation and maintenance costs. Consequently, networks based on Low-Cost Sensors (LCS) have emerged as a viable alternative. Nevertheless, LCS systems have certain drawbacks, such as lower reading precision, which can be mitigated through specific calibration models and methods. This paper presents the results and conclusions of simultaneous PM10 and PM2.5 monitoring comparisons between LCS nodes and a T640X reference monitor. Additionally, Relative Humidity (RH), temperature, and absorption flow measurements were collected via an Automet meteorological station. The monitoring equipment was installed at the Faculty of Environment of the Universidad Distrital in Bogotá. The LCS calibration process began with data preprocessing, which involved filtering, segmentation, and temporal alignment with FastDTW. Calibration was then performed with six models: two statistical approaches, three Machine Learning algorithms, and one Deep Learning model. The findings highlight the importance of applying FastDTW during preprocessing and of incorporating RH, temperature, and absorption flow as covariates to improve accuracy. The study also concludes that Random Forest and XGBoost achieved the highest performance among the methods evaluated. While satellites map city-wide patterns and MAX-DOAS enables hourly source attribution, our calibrated LCS network supplies continuous, street-scale data at low capital and operating cost (CAPEX/OPEX), forming a practical backbone for sustained micro-scale monitoring in Bogotá.
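To make the pipeline concrete, the sketch below illustrates the general shape of the approach the abstract describes: FastDTW-based temporal alignment of an LCS PM2.5 series against a reference series, followed by a Random Forest calibration model that includes RH, temperature, and flow as covariates. It uses the open-source fastdtw and scikit-learn packages; the file name, column names (pm25_lcs, pm25_ref, rh, temperature, flow), and the averaging strategy for warped samples are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of an LCS calibration pipeline (assumptions noted above):
# 1) align the low-cost sensor series to the reference with FastDTW,
# 2) fit a Random Forest using meteorological covariates.
import numpy as np
import pandas as pd
from fastdtw import fastdtw
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def align_with_fastdtw(lcs: np.ndarray, ref: np.ndarray) -> np.ndarray:
    """Warp the LCS series onto the reference timeline via the FastDTW path."""
    _, path = fastdtw(lcs, ref, dist=lambda a, b: abs(a - b))
    # For each reference index, average the LCS samples the path maps to it.
    sums = np.zeros(len(ref))
    counts = np.zeros(len(ref))
    for i, j in path:
        sums[j] += lcs[i]
        counts[j] += 1
    aligned = np.full(len(ref), np.nan)
    mask = counts > 0
    aligned[mask] = sums[mask] / counts[mask]
    return aligned

# Hypothetical hourly co-location dataset (column names are assumptions).
df = pd.read_csv("colocation_hourly.csv")
df["pm25_lcs_aligned"] = align_with_fastdtw(
    df["pm25_lcs"].to_numpy(), df["pm25_ref"].to_numpy()
)
df = df.dropna()

# Calibration: predict the reference concentration from the aligned LCS
# reading plus RH, temperature, and flow.
X = df[["pm25_lcs_aligned", "rh", "temperature", "flow"]]
y = df["pm25_ref"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {model.score(X_test, y_test):.3f}")
```

An XGBoost variant would follow the same pattern, swapping in xgboost.XGBRegressor for the Random Forest; the alignment and covariate steps are unchanged.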