Abstract
This paper presents a robust, adaptive visual-servoing-based landing control method for unmanned aerial vehicles (UAVs) equipped with a three-axis gimbal camera. To overcome the limitations of fixed-camera configurations, the proposed approach enforces pixel-level field-of-view (FOV) constraints and exploits the gimbal's agility for enhanced visual tracking. The landing task is formulated as a constrained image-based control problem in which image-feature tracking errors are rigorously bounded by prescribed performance functions. A velocity observer estimates the time-varying motion of the landing platform in real time, enabling accurate autonomous landing without reliance on external communication or infrastructure. Lyapunov-based stability analysis establishes the theoretical soundness of the control strategy, and simulation results confirm its effectiveness and robustness, demonstrating improved accuracy, adaptability, and practical applicability in UAV landing scenarios.
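To make the prescribed-performance idea mentioned above concrete, the following is a minimal illustrative sketch (not the paper's implementation) of the standard prescribed-performance construction from the control literature: an exponentially decaying bound rho(t) confines the tracking error, and a logarithmic transformation maps the constrained error into an unconstrained one that a controller can drive to zero. All function names and parameter values (rho0, rho_inf, decay) are hypothetical choices for illustration only.

```python
import math

def performance_bound(t, rho0=1.0, rho_inf=0.05, decay=1.0):
    """Prescribed performance function rho(t) = (rho0 - rho_inf) * exp(-decay * t) + rho_inf.

    rho0    : initial bound on the image-feature tracking error (assumed value)
    rho_inf : residual steady-state bound (assumed value)
    decay   : convergence-rate parameter (assumed value)
    """
    return (rho0 - rho_inf) * math.exp(-decay * t) + rho_inf

def transformed_error(e, rho):
    """Map a bounded error -rho < e < rho to an unconstrained variable.

    Uses the common transformation eps = 0.5 * ln((1 + z) / (1 - z)),
    where z = e / rho is the normalized error; eps blows up as |e| -> rho,
    so keeping eps bounded enforces the prescribed performance envelope.
    """
    z = e / rho
    if abs(z) >= 1.0:
        raise ValueError("error outside prescribed performance bound")
    return 0.5 * math.log((1 + z) / (1 - z))

# Example: the bound starts at rho0 and decays toward rho_inf,
# while the transformed error grows as e approaches the bound.
t = 0.0
rho = performance_bound(t)          # = 1.0 at t = 0
eps_small = transformed_error(0.1, rho)
eps_large = transformed_error(0.9, rho)
```

In a prescribed-performance controller, the control law is designed to keep the transformed error `eps` bounded, which by construction guarantees the original pixel error stays inside the shrinking envelope rho(t).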