Abstract
Space surveillance radar systems transmit and receive electromagnetic (EM) waves to detect and track space objects across a wide range of elevation angles, including low angles between 30° and 60°. At these angles, ionospheric effects such as refraction, attenuation, and Faraday rotation (FR) become significant. FR rotates the polarization plane of EM waves, causing polarization mismatch and reducing received signal power. Accurate estimation of the Faraday rotation angle (FRA) is therefore essential to mitigate such losses in radar performance. This study proposes a method for precisely calculating the FRA of EM waves propagating through the ionosphere. The approach is based on the Appleton-Hartree equation, which offers a complete solution for arbitrary propagation angles and frequencies. It incorporates three-dimensional electron density and geomagnetic field data along the wave's path, enabling accurate modeling of ionospheric effects. The estimated FRA is then used to analyze the corresponding polarization loss characteristics at the radar receiver. This methodology supports enhanced prediction of signal degradation in space surveillance radar systems and provides a more realistic modeling framework for ionospheric influence, especially under low-angle propagation conditions.
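To give a sense of the magnitudes involved, the FRA is often approximated with the quasi-longitudinal simplification of the Appleton-Hartree treatment, in which the rotation is proportional to the path integral of electron density times the parallel geomagnetic field component, divided by frequency squared. The sketch below illustrates this simplified relation and the resulting polarization mismatch loss for a linearly polarized receiver; the function names, the single-slab path discretization, and the 435 MHz example frequency are illustrative assumptions, not the paper's actual implementation, which solves the full equation for arbitrary propagation angles:

```python
import numpy as np

def faraday_rotation_angle(f_hz, ne_m3, b_par_t, ds_m):
    """Quasi-longitudinal Faraday rotation angle in radians.

    f_hz    : wave frequency [Hz]
    ne_m3   : electron density samples along the ray path [m^-3]
    b_par_t : geomagnetic field component parallel to propagation [T]
    ds_m    : length of each path segment [m]
    """
    # K = e^3 / (2 * eps0 * me^2 * c * (2*pi)^2) in SI units
    K = 2.36e4
    return (K / f_hz**2) * np.sum(ne_m3 * b_par_t * ds_m)

def polarization_loss_db(omega_rad):
    """One-way polarization mismatch loss for a linearly polarized
    antenna pair, L = cos^2(Omega), expressed as a positive dB loss."""
    return -10.0 * np.log10(np.cos(omega_rad) ** 2)

# Illustrative single-slab example (assumed values): a vertical TEC of
# 2e17 el/m^2 collapsed into one 1 m "slab", B_parallel = 40 uT, UHF radar.
omega = faraday_rotation_angle(
    f_hz=435e6,
    ne_m3=np.array([2e17]),
    b_par_t=np.array([4e-5]),
    ds_m=np.array([1.0]),
)
loss = polarization_loss_db(omega)
```

With these assumed values the rotation is on the order of 1 rad, producing a polarization loss of several dB, which illustrates why FRA estimation matters at UHF; the 1/f² dependence means the effect shrinks rapidly at higher radar frequencies.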