Method for building segmentation and extraction from high-resolution remote sensing images based on improved YOLOv5ds


Abstract

To address challenges in remote sensing images, such as the abundance of buildings, difficulty in contour extraction, and slow update speeds, a high-resolution remote sensing image building segmentation and extraction method based on the YOLOv5ds network structure is proposed for Gaofen-2 images. The method, named YOLOv5ds-RC, comprises three primary components: target detection, semantic segmentation, and edge optimization. In the semantic segmentation module, an upsampling branch with multiple convolutional layers extends from the second feature fusion layer of the Feature Pyramid Network (FPN), producing a category map that matches the original image size. For edge optimization, a raster compression module is appended to the end of the segmentation network to refine the segmentation contours. This approach enables effective segmentation of Gaofen-2 images, achieving detailed results at the individual-building scale across urban areas and facilitating rapid contour optimization and extraction. Experimental results indicate that YOLOv5ds-RC achieves a precision of 0.8849, a recall of 0.63904, an average precision (AP) at an IoU threshold of 0.5 of 0.75863, and a mean average precision (mAP) over IoU thresholds from 0.5 to 0.95 of 0.47388. These metrics significantly surpass those of the original YOLOv5ds, which recorded 0.81483 precision, 0.51332 recall, 0.63552 AP at 0.5, and 0.34922 mAP. The algorithm effectively corrects building displacement in non-orthorectified images and achieves more objective and accurate contour extraction, meeting the requirements for rapid extraction. These properties allow YOLOv5ds-RC to support fully automated rapid extraction and historical change analysis in land-use change monitoring.
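As a rough illustration (not the paper's implementation), the core idea of the segmentation branch — expanding a coarse per-class score map from an FPN fusion layer back to the input resolution and taking a per-pixel argmax to obtain the category map — can be sketched in NumPy. All shapes, the class count, and the use of nearest-neighbor upsampling here are illustrative assumptions, not values or operations taken from the paper:

```python
import numpy as np

def upsample_nearest(x, factor):
    """Nearest-neighbor upsampling of a (C, H, W) score map by an integer factor."""
    return x.repeat(factor, axis=1).repeat(factor, axis=2)

# Toy stand-in for the coarse per-class scores produced after the FPN's
# second feature fusion layer (class count and stride are assumptions).
coarse = np.zeros((2, 4, 4))        # 2 classes (background, building) on a 4x4 grid
coarse[1, 1:3, 1:3] = 5.0           # high "building" score in the centre 2x2 block

full = upsample_nearest(coarse, 8)  # back to a 32x32 "original image" resolution
category_map = full.argmax(axis=0)  # per-pixel class index, matching input size

print(category_map.shape)           # (32, 32)
print(int(category_map.sum()))      # 256 pixels labelled "building" (16x16 block)
```

In the actual network, the upsampling would be interleaved with learned convolutional layers rather than a single nearest-neighbor expansion, but the output is the same kind of per-pixel category map described in the abstract.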
