EM-YOLO: high-precision electronic component detection via multi-scale attention and dynamic feature fusion



Abstract

To address frequent missed detections and limited accuracy in the inspection of electronic components on circuit boards, a target detection algorithm for electronic components, EM-YOLO, is designed with YOLOv11 as the core framework. The design follows a three-level optimization strategy. First, in the backbone network, an efficient feature extraction module, C3k2_EMA, is designed: by incorporating the Efficient Multi-scale Attention (EMA) mechanism, it strengthens the network's feature representation of tiny components. Second, in the neck network, the BiSPD-FPN structure is proposed, which adds a high-resolution feature map at the P2 level; by combining the Bidirectional Feature Pyramid Network (BiFPN) with Spatial Pyramid Depthwise Convolution (SPDConv), it improves multi-scale feature fusion and reduces detail loss during downsampling. Finally, in the detection head, the Focal-DIoU loss function is introduced to optimize bounding box regression and improve the localization accuracy of densely arranged components. Experimental results show that on a self-built dataset and the public PCB Electronic Components Dataset, the detection accuracy (mAP@0.5) of the improved algorithm increases by 0.5% and 3.9% respectively over the baseline, while the false negative rate (FNR) decreases by 1.3% and 4% respectively, outperforming mainstream algorithms in both cases. The proposed algorithm therefore achieves good detection accuracy, effectively alleviates missed detections, generalizes well, and provides an effective detection scheme for component inspection.
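The BiFPN component of the proposed BiSPD-FPN fuses feature maps from multiple scales with learned weights. The abstract does not give the fusion rule, so as a generic illustration (not the paper's code) the standard BiFPN "fast normalized fusion", O = sum_i(w_i * I_i) / (eps + sum_j(w_j)), can be sketched with plain Python lists standing in for feature maps:

```python
def fast_normalized_fusion(features, weights, eps=1e-4):
    """BiFPN-style fast normalized fusion (generic sketch, not from the paper).

    Each input feature is scaled by a learned non-negative weight, and the
    weights are normalized by their sum plus eps rather than by a softmax,
    which is cheaper while behaving similarly.
    """
    # ReLU keeps the learned weights non-negative before normalization
    w = [max(0.0, x) for x in weights]
    total = sum(w) + eps

    # Weighted sum of the (already resized/aligned) input features
    fused = [0.0] * len(features[0])
    for wi, feat in zip(w, features):
        for i, v in enumerate(feat):
            fused[i] += (wi / total) * v
    return fused
```

In a real network the inputs would be tensors resampled to a common resolution and the weights would be trainable parameters; the normalization logic is the same.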
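The Focal-DIoU loss named in the abstract is not defined here; a minimal sketch, assuming it pairs the standard DIoU penalty (1 - IoU plus a normalized center-distance term) with a focal-style IoU**gamma modulation in the spirit of Focal-EIoU, could look like the following, where boxes are (x1, y1, x2, y2) and gamma is a hypothetical default:

```python
def diou_loss(box_a, box_b):
    """DIoU loss between two axis-aligned boxes (x1, y1, x2, y2).

    L_DIoU = 1 - IoU + rho^2 / c^2, where rho is the distance between box
    centers and c is the diagonal of the smallest enclosing box.
    Returns (loss, iou).
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection and union areas
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    iou = inter / union if union > 0 else 0.0

    # Squared distance between box centers (rho^2)
    rho2 = ((ax1 + ax2) / 2 - (bx1 + bx2) / 2) ** 2 \
         + ((ay1 + ay2) / 2 - (by1 + by2) / 2) ** 2

    # Squared diagonal of the smallest enclosing box (c^2)
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw * cw + ch * ch

    return 1.0 - iou + (rho2 / c2 if c2 > 0 else 0.0), iou


def focal_diou_loss(box_pred, box_gt, gamma=0.5):
    """Focal-style reweighting of DIoU (assumption, not the paper's exact
    formula): IoU**gamma scales the loss so better-aligned boxes dominate
    the gradient, which helps separate densely arranged components."""
    loss, iou = diou_loss(box_pred, box_gt)
    return (iou ** gamma) * loss
```

For identical boxes the DIoU loss is zero; as boxes drift apart, the center-distance term keeps the gradient informative even when the IoU term saturates at zero overlap.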
