Research on remote sensing multi-image super-resolution based on [Formula: see text]N



Abstract

Remote sensing images are essential in various fields, but their high-resolution (HR) acquisition is often limited by factors such as sensor resolution and high costs. To address this challenge, we propose the Multi-image Remote Sensing Super-Resolution with Enhanced Spatio-temporal Feature Interaction Fusion Network ([Formula: see text]N), an end-to-end deep neural network. The main innovations of the [Formula: see text]N model are as follows. First, the Attention-Based Feature Encoder (ABFE) module precisely extracts the spatial features of low-resolution (LR) images; combined with the Channel Attention Block (CAB) module, which provides global information guidance and weighting for the input features, it effectively strengthens the spatial feature extraction capability of ABFE. Second, for temporal feature modeling, we design the Residual Temporal Attention Block (RTAB). This module weights k LR images of the same location captured at different times via a global residual temporal connection mechanism, fully exploiting their similarities and temporal dependencies and enhancing cross-layer information transmission. The ConvGRU-RTAB Fusion Module (CRFM) captures temporal features with RTAB on top of ABFE and fuses the spatial and temporal features. Finally, the Decoder module upscales the fused features to achieve high-quality super-resolution image reconstruction. Comparative experiments show that our model achieves notable improvements in the cPSNR metric, reaching 49.69 dB and 51.57 dB in the NIR and RED bands of the PROBA-V dataset, respectively. The visual quality of the reconstructed images surpasses that of state-of-the-art methods, including TR-MISR and MAST.
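The cPSNR figures above follow the corrected-PSNR metric of the PROBA-V super-resolution challenge, which compensates the mean brightness bias between the HR reference and the reconstruction and searches over small registration shifts. A minimal NumPy sketch of that idea (the function name `cpsnr`, the `max_shift` parameter, and the assumption that intensities lie in [0, 1] are ours, not from the paper):

```python
import numpy as np

def cpsnr(hr: np.ndarray, sr: np.ndarray, max_shift: int = 3) -> float:
    """Simplified corrected PSNR: for each small registration shift of the
    super-resolved image, subtract the mean brightness bias before computing
    the MSE, and keep the best (highest) PSNR over all shifts."""
    h, w = hr.shape
    c = max_shift
    hr_crop = hr[c:h - c, c:w - c]          # fixed central crop of the reference
    best = -np.inf
    for dy in range(2 * c + 1):
        for dx in range(2 * c + 1):
            sr_crop = sr[dy:dy + h - 2 * c, dx:dx + w - 2 * c]
            bias = np.mean(hr_crop - sr_crop)           # brightness compensation
            mse = np.mean((hr_crop - (sr_crop + bias)) ** 2)
            if mse == 0:
                return np.inf
            best = max(best, 10.0 * np.log10(1.0 / mse))  # peak value 1.0
    return best
```

A perfect reconstruction (or one differing only by a constant brightness offset) scores very high, while noise lowers the score; the shift search makes the metric tolerant of small misregistrations between the reconstruction and the reference.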
