Automated Industrial Composite Fiber Orientation Inspection Using Attention-Based Normalized Deep Hough Network


Abstract

Fiber-reinforced composites (FRC) are widely used in various fields due to their excellent mechanical properties. The mechanical properties of FRC are significantly governed by the orientation of the fibers in the composite. Automated visual inspection, which uses image processing algorithms to analyze texture images of FRC, is the most promising method for measuring fiber orientation. The deep Hough transform (DHT) is a powerful image processing method for automated visual inspection, as it can efficiently detect the "line-like" structures of the fiber texture in FRC. However, the DHT remains sensitive to background anomalies and long-line-segment anomalies, which degrades the performance of fiber orientation measurement. To reduce the sensitivity to long-line-segment anomalies, we introduce deep Hough normalization, which normalizes the accumulated votes in the deep Hough space by the length of the corresponding line segment, making it easier for the DHT to detect short, true "line-like" structures. To reduce the sensitivity to background anomalies, we design an attention-based deep Hough network (DHN) that integrates an attention network and a Hough network. The network effectively eliminates background anomalies, identifies important fiber regions, and detects their orientations in FRC images. To better investigate fiber orientation measurement methods for FRC in real-world scenarios with various types of anomalies, we establish three datasets and evaluate the proposed method extensively on them. The experimental results and analysis show that the proposed methods achieve performance competitive with the state of the art in F-measure, Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE).
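The length-normalization idea in the abstract has a classical analogue that can be sketched directly: in a standard Hough line transform, a cell's raw vote count grows with the length of the corresponding line inside the image, so long structures dominate short ones; dividing each cell's votes by the number of image pixels that fall into that cell turns the count into a length-invariant occupancy density. The sketch below illustrates this principle on the classical (non-deep) Hough transform only. It is not the paper's learned deep Hough network, and the function name `normalized_hough` and the per-cell length estimate are our own assumptions.

```python
import numpy as np

def normalized_hough(edges, n_theta=180):
    """Classical Hough line transform with length-normalized votes.

    `edges` is a boolean (H, W) edge map. For each (rho, theta) cell we
    accumulate raw votes from edge pixels, and separately count how many
    image pixels (edge or not) map into the same cell -- a proxy for the
    line's extent inside the image. Dividing votes by this length makes
    the score a density rather than a raw count, removing the bias
    toward long line segments.
    """
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    votes = np.zeros((2 * diag + 1, n_theta))
    length = np.zeros_like(votes)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    t_idx = np.arange(n_theta)
    for y in range(h):
        for x in range(w):
            # rho = x*cos(theta) + y*sin(theta), shifted to a non-negative bin
            rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
            length[rhos, t_idx] += 1.0      # pixels the line could cover
            if edges[y, x]:
                votes[rhos, t_idx] += 1.0   # pixels actually on an edge
    normalized = votes / np.maximum(length, 1.0)
    return votes, normalized, thetas, diag

if __name__ == "__main__":
    edges = np.zeros((40, 40), dtype=bool)
    edges[10, :] = True  # full-width horizontal line (theta = 90 deg, rho = 10)
    votes, norm, thetas, diag = normalized_hough(edges, n_theta=180)
    print(votes[10 + diag, 90], norm[10 + diag, 90])  # 40.0 1.0
```

A fully occupied line scores a normalized density of 1.0 regardless of its length, whereas raw votes scale linearly with length; a learned variant would apply the same division inside the deep Hough space instead of on a hand-built accumulator.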
