DPRM: DeBERTa-based potential relationship multi-headed self-attention joint extraction model


Abstract

Traditional joint entity-relation extraction models are typically designed for generic-domain data, which limits their effectiveness in domain-specific applications such as manufacturing. This study presents the DeBERTa-based Potential Relationship Multi-Headed Self-Attention Joint Extraction Model (DPRM), designed specifically to improve the accuracy and efficiency of entity-relation extraction for manufacturing knowledge graphs. The model comprises three core components: a semantic representation module, a relationship extraction and entity recognition module, and a global entity pairing module. The semantic representation module encodes input sentences with DeBERTa to produce word embeddings, then captures word dependencies with Bi-GRU and multi-headed self-attention to enrich the sentence representation. The relationship extraction and entity recognition module identifies potential relationships within a sentence and applies a relational gating mechanism to suppress irrelevant information during entity recognition. The global entity pairing module simplifies the model's architecture by constructing a global entity-pair matrix over the extracted potential relationships, based on fault-specific data. The proposed model is validated through experiments on fault datasets. The results show that DPRM achieves an F1 score surpassing that of existing models, demonstrating its effectiveness in the fault domain.
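The abstract does not include code, and the paper's implementation details are not given here. As a rough illustration of the multi-headed self-attention step in the semantic representation module, the following is a minimal NumPy sketch; all shapes, weight matrices, and the assumption that the input is a matrix of encoder token embeddings are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Scaled dot-product multi-head self-attention over one sentence.

    X: (seq_len, d_model) token embeddings, e.g. the output of a
       DeBERTa + Bi-GRU stack (hypothetical interface, for illustration).
    Wq, Wk, Wv, Wo: (d_model, d_model) projection weights.
    """
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    # Split each projection into heads: (n_heads, seq_len, d_head).
    def split(M):
        return M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    # Attention weights per head: (n_heads, seq_len, seq_len).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # Weighted values, merged back to (seq_len, d_model), then projected.
    out = (attn @ Vh).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

# Toy usage with random weights (purely illustrative).
rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 5, 8, 2
X = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))
enhanced = multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads)
print(enhanced.shape)  # (5, 8): same shape as the input embeddings
```

Each head attends over the whole sentence with its own subspace of the embedding, which is how such a layer can capture multiple word-dependency patterns at once; the described model combines this with Bi-GRU outputs rather than using it in isolation.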
