A Novel Framework for Reconstruction and Imaging of Target Scattering Centers via Wide-Angle Incidence in Radar Networks


Abstract

The precise reconstruction of target scattering centers (TSCs) using sensors plays a crucial role in feature extraction and identification of non-cooperative targets. Radar sensor networks (RSNs) are well suited for this task, as they are capable of illuminating targets from multiple aspect angles and rapidly capturing reflected signals. However, the complex geometry and diverse material composition of real-world targets result in significant variations in the radar cross-section (RCS) observed at different angles. Although these RCS responses are interrelated, they exhibit considerable angular diversity. Furthermore, achieving precise spatiotemporal registration and fully coherent processing is infeasible for RSNs composed of small mobile sensor platforms, such as drone swarms. Therefore, an intelligent algorithm is required to extract and accumulate correlated and meaningful information from the target echoes received by the RSN. In this work, a novel collaborative TSC reconstruction framework for RSNs is proposed. The framework performs similarity evaluation on wide-angle high-resolution range profiles (HRRPs) to achieve adaptive angular segmentation of TSC models. It combines the expectation-maximization (EM) algorithm with an enhanced Arctic puffin optimization (EAPO) algorithm to effectively integrate echo information from the RSN in a non-coherent manner, thereby enabling accurate TSC estimation. The proposed method outperforms existing mainstream approaches in terms of spatiotemporal registration requirements, estimation accuracy, and stability. Comparative experiments on measured datasets demonstrate the robustness of the framework and its adaptability to complex target scattering characteristics, confirming its practical value.
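The abstract's adaptive angular segmentation step — grouping aspect angles whose high-resolution range profiles remain mutually similar, so a single TSC model can cover each group — can be illustrated with a minimal sketch. This is not the paper's implementation; the greedy strategy, the cosine-similarity measure, and the threshold value are all assumptions made for illustration.

```python
import numpy as np

def segment_angles(hrrps, threshold=0.8):
    """Greedy angular segmentation of HRRPs by cosine similarity (illustrative
    sketch only; the paper's actual similarity evaluation may differ).

    hrrps: array of shape (num_angles, num_range_bins); row i is the HRRP
    magnitude observed at the i-th aspect angle.
    Returns a list of (start, end) index pairs (end exclusive). Within a
    segment, every profile stays similar to the segment's first profile,
    so one scattering-center model can plausibly describe the whole segment.
    """
    segments = []
    start = 0
    ref = hrrps[0] / np.linalg.norm(hrrps[0])
    for i in range(1, len(hrrps)):
        cur = hrrps[i] / np.linalg.norm(hrrps[i])
        # A drop in similarity marks an angular sector boundary, where the
        # target's scattering behavior has changed enough to need a new model.
        if np.dot(ref, cur) < threshold:
            segments.append((start, i))
            start = i
            ref = cur
    segments.append((start, len(hrrps)))
    return segments
```

For example, a synthetic set of profiles whose dominant scatterer jumps from one range bin to another halfway through the angular sweep would be split into two sectors at the jump.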
