Abstract
This article presents a method for building accurate environmental models from fused sensor data. RGB-D cameras and LiDAR are commonly used for data collection, but both have practical limitations: RGB-D cameras are sensitive to lighting conditions and limited in range, while planar LiDAR struggles with certain object materials and provides no vertical coverage. To address these issues, we propose a lightweight, field-deployable depth-bias correction pipeline. Rather than introducing a novel mapping backend, our contribution lies in using high-precision planar LiDAR data to dynamically correct the non-linear depth drift of low-cost RGB-D sensors. The corrected data are then integrated into a standard probabilistic 2D occupancy grid map for autonomous systems. Experimental results show that the proposed method substantially reduces depth estimation error, from 0.604 m to 0.340 m at a range of 2.16 m, and successfully detects hollow obstacles that standard 2D LiDAR scans miss. The study thus presents a reproducible sensor-fusion pipeline that improves map quality in indoor logistics environments.