Abstract
Space-based gravitational wave detection places extremely stringent demands on displacement measurement accuracy; its core measurement components are laser interferometers and inertial sensors. The laser interferometers detect gravitational wave signals by measuring the distance between two test masses (TMs) housed within the inertial sensors. Alignment errors of the TMs relative to the laser interferometers can severely degrade interferometric performance, chiefly by amplifying tilt-to-length (TTL) coupling noise and reducing interferometric efficiency. This paper presents a systematic analysis of the coupling mechanism between TM alignment errors and TTL coupling noise. We first establish a comprehensive TTL noise model that accounts for alignment errors, and then verify and analyze it through optical simulations. The study clarifies how TM alignment errors couple into the measurement in space-borne gravitational wave missions and derives the alignment tolerances required to meet the detection sensitivity goals. This work provides a theoretical foundation and design guidance for ground alignment procedures and on-orbit performance prediction in future space-based gravitational wave detection missions.