Abstract
Accurate and stable extrinsic calibration is the foundation of high-quality camera-LiDAR (Light Detection and Ranging) fusion for sensing and positioning. However, traditional targetless calibration methods suffer from poor scene adaptability and unstable convergence, which significantly restrict calibration accuracy and robustness in complex environments. To address these problems, we propose an online cascade-optimization-based extrinsic calibration method that combines motion trajectory alignment with edge feature alignment. In the initial calibration stage, a hand-eye calibration algorithm minimizes the residual discrepancies between the camera odometry and LiDAR odometry sequences, establishing a robust initialization for the subsequent optimization. Then, to extract robust edge line features from sparse point clouds, we exploit both depth-discontinuity edges and planar edges of the point clouds during optimization. Subsequently, principal component analysis (PCA) is applied to compute the principal direction of the extracted line features, enabling a decoupled optimization scheme that accounts for directional observability and effectively mitigates the adverse effects of uneven environmental feature distributions. Experimental validation on typical urban datasets demonstrates the method's generalizability and competitive accuracy: rotational parameter errors are constrained within 0.25°, and translational errors remain below 0.05 m, affirming the method's suitability for high-accuracy engineering applications.
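As an illustrative sketch only (not the authors' implementation), the PCA step mentioned in the abstract, computing the principal direction of an extracted edge-line point set, can be realized with a standard eigendecomposition of the point covariance. The function name, array shapes, and thresholds below are assumptions introduced for illustration:

```python
import numpy as np

def principal_direction(points: np.ndarray) -> np.ndarray:
    """Estimate the dominant direction of a 3D edge-point set via PCA.

    points: (N, 3) array of edge points extracted from a LiDAR cloud
            (hypothetical input; shape and name are not from the paper).
    Returns a unit vector along the fitted line's principal direction.
    """
    centered = points - points.mean(axis=0)       # remove the centroid
    cov = centered.T @ centered / len(points)     # 3x3 sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    return eigvecs[:, -1]                         # eigenvector of largest eigenvalue
```

In a decoupled scheme such as the one described, the returned unit vectors could be aggregated per line feature to assess how well each rotational degree of freedom is constrained before optimizing it.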