Super-Resolution pedestrian re-identification method based on bidirectional generative adversarial network



Abstract

In fields such as intelligent security, pedestrian re-identification is a crucial technology. In real surveillance scenarios, however, low-resolution images caused by factors such as shooting distance lead to severe loss of detail and degraded recognition performance. To overcome the over-sharpening and artifacts that traditional super-resolution methods introduce when reconstructing pedestrian images, a super-resolution pedestrian re-identification method based on a bidirectional generative adversarial network is proposed. Its core innovation is a bidirectional adversarial architecture that combines forward super-resolution reconstruction with backward downsampling simulation. By introducing residual-in-residual dense blocks and optimizing the loss function on the basis of ESRGAN, the realism and naturalness of the reconstructed images are significantly improved. Experiments show that the proposed method (BSRGAN-ReID) achieves leading performance on multiple public datasets: on Urban100 it reaches a PSNR of 34.23 and an SSIM of 0.78, and its mean average precision (mAP) on DukeMTMC-reID and CUHK03 reaches 91.4% and 82.7%, respectively. In a simulated surveillance test, the method achieves a correct recognition rate of 90.2%, with false positive and false negative rates both below 7%, while consuming fewer computational resources and responding faster. The main contribution of this work is an efficient and robust solution to the problem of low-resolution pedestrian re-identification, with strong theoretical value and practical application potential.
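The abstract reports reconstruction quality as PSNR and SSIM. As a minimal sketch of the first of these metrics (not the authors' evaluation code; the image sizes, noise level, and function name here are illustrative assumptions), PSNR compares a reference image with a reconstruction via the mean squared error:

```python
import numpy as np

def psnr(ref, img, max_val=255.0):
    """Peak signal-to-noise ratio: 10 * log10(MAX^2 / MSE) between two images."""
    mse = np.mean((ref.astype(np.float64) - img.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: no distortion
    return 10.0 * np.log10((max_val ** 2) / mse)

# Toy check on synthetic data: a perfect reconstruction scores infinity,
# while a mildly noisy copy scores a finite value (higher = closer to the reference).
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-5, 6, size=ref.shape)
noisy = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(psnr(ref, noisy))
```

SSIM additionally accounts for luminance, contrast, and structural similarity over local windows; library implementations (e.g. scikit-image's `structural_similarity`) are typically used rather than hand-rolled code.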
