Using the improved YOLOv11 model to enhance computer vision applications for building crack detection algorithms


Abstract

With the acceleration of urbanization, building crack detection has become an important task for ensuring structural safety. Traditional detection methods face challenges such as low efficiency and high error rates. Deep learning, particularly the YOLO series of algorithms, has become an effective technology for addressing these issues. This study introduces an enhanced model derived from the YOLOv11 algorithm, designed to improve both the precision and the real-time efficiency of building crack detection. The improved YOLOv11 model introduces innovative designs, namely the C3K2-SG module, the FPSConv module, and the Inner_MPDIoU loss function, which together strengthen crack feature extraction, fine-grained feature fusion, and small-target detection accuracy. The C3K2-SG module improves crack detection in complex backgrounds, the FPSConv module optimizes the detection of cracks at various scales, and the Inner_MPDIoU loss function improves the localization of small crack targets. Experimental results show that the enhanced model reaches a detection accuracy (mAP@0.5) of 88.6%, a 4.6% increase over the original YOLOv11; precision and recall increased by 3.5% and 7.6%, respectively. The study also provides a detailed comparative analysis of four typical crack types (vertical cracks, horizontal cracks, multi-level cracks, and complex cracks). Compared with YOLOv11, the improved model demonstrates significant advantages in detecting small targets and complex cracks, with an average improvement of 0.12 in detection accuracy. The model offers an efficient and accurate solution for building crack detection and has broad application prospects, especially in intelligent inspection and building safety monitoring.
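The Inner_MPDIoU loss mentioned above is not defined in this abstract, but its name suggests a combination of two published ideas: MPDIoU, which penalizes the squared distances between corresponding corner points of the predicted and ground-truth boxes (normalized by the image size), and Inner-IoU, which computes the overlap term on auxiliary boxes rescaled around each box's center by a ratio hyperparameter. The following is a minimal sketch of such a combination, assuming axis-aligned `(x1, y1, x2, y2)` boxes and a single `ratio` parameter; the paper's exact formulation may differ.

```python
import numpy as np

def inner_mpdiou_loss(pred, gt, img_w, img_h, ratio=0.75, eps=1e-9):
    """Sketch of an Inner-MPDIoU-style loss for one box pair.

    pred, gt : boxes as (x1, y1, x2, y2)
    img_w, img_h : input image size, used to normalize corner distances
    ratio : Inner-IoU scale factor for the auxiliary boxes (assumed)
    """
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)

    def inner_box(b):
        # Auxiliary box: same center, width/height scaled by `ratio`
        cx, cy = (b[0] + b[2]) / 2, (b[1] + b[3]) / 2
        w, h = (b[2] - b[0]) * ratio, (b[3] - b[1]) * ratio
        return np.array([cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2])

    pi, gi = inner_box(pred), inner_box(gt)

    # IoU of the auxiliary (inner) boxes
    ix1, iy1 = max(pi[0], gi[0]), max(pi[1], gi[1])
    ix2, iy2 = min(pi[2], gi[2]), min(pi[3], gi[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_p = (pi[2] - pi[0]) * (pi[3] - pi[1])
    area_g = (gi[2] - gi[0]) * (gi[3] - gi[1])
    inner_iou = inter / (area_p + area_g - inter + eps)

    # MPDIoU corner-distance penalties, normalized by the squared
    # image diagonal, so the loss is scale-aware for small targets
    d1 = (pred[0] - gt[0]) ** 2 + (pred[1] - gt[1]) ** 2  # top-left corners
    d2 = (pred[2] - gt[2]) ** 2 + (pred[3] - gt[3]) ** 2  # bottom-right corners
    norm = img_w ** 2 + img_h ** 2

    mpdiou = inner_iou - d1 / norm - d2 / norm
    return 1.0 - mpdiou  # loss: 0 for a perfect match, larger when boxes diverge
```

For identical boxes the loss is essentially zero, while non-overlapping boxes yield a loss above 1 that still carries a useful gradient through the corner-distance terms, which is the usual motivation for MPDIoU-style losses on small, hard-to-localize targets such as fine cracks.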
