KGMP: Augmenting retrieval knowledge graph with multi-hop perceptron


Abstract

The core challenge of Knowledge Base Question Answering (KBQA), a bridge between natural language and structured knowledge, is to accurately map complex semantic queries into a Graph Query Language (GQL). Compared with the traditional Text-to-SQL task, KBQA faces a dual challenge: the structural differences between GQL and SQL, and the lack of higher-order subgraph information in multi-hop reasoning over knowledge graphs. While existing approaches such as ChatKBQA have made progress, limited subgraph scalability severely constrains multi-hop query performance. To this end, this study proposes the Knowledge Graph Multi-hop Perceptron (KGMP), a retrieval-generation framework fine-tuned on open-source large language models, whose novelty is threefold. (1) Dynamic graph traversal mechanism: through an iterative subgraph-expansion strategy, KGMP achieves dynamic traversal of question-oriented graphs with progressive reasoning. (2) Structured interaction protocol: building on SPARQL syntax, KGMP designs a lightweight interaction instruction set that serves as an efficient communication interface between the LLM and the knowledge graph. (3) Graph-structure optimization: KGMP develops subgraph-reordering algorithms and pruning strategies based on a reranker model to ensure that the subgraphs fed to the LLM are both compact and semantically complete. Integrating KGMP as the retrieval module of the ChatKBQA framework and supplying it with optimized multi-hop subgraph inputs yields performance improvements of 6.2% and 5.3% on the WebQSP and CWQ datasets, respectively. This study provides a new technical paradigm for deep collaboration between LLMs and knowledge graphs.
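The abstract's first and third innovations — iterative subgraph expansion and reranker-based pruning — can be illustrated with a minimal sketch. Everything below is a hypothetical toy (the graph, the `relevance` scorer standing in for a reranker model, and all function names are assumptions, not the paper's actual implementation):

```python
# Hypothetical sketch of iterative, question-oriented subgraph expansion
# with pruning. The toy graph, the keyword-overlap `relevance` scorer
# (a stand-in for a learned reranker), and all names are illustrative
# assumptions, not KGMP's actual code.

# Toy knowledge graph: entity -> list of (relation, neighbor) edges.
GRAPH = {
    "Paris": [("capital_of", "France"), ("located_in", "Europe")],
    "France": [("member_of", "EU"), ("capital", "Paris")],
    "EU": [("headquartered_in", "Brussels")],
    "Europe": [],
    "Brussels": [],
}

def relevance(question, triple):
    """Stand-in for a reranker: score a triple against the question by
    naive substring overlap (a real system would use a cross-encoder)."""
    text = " ".join(triple).lower().replace("_", " ")
    return sum(1 for w in question.lower().split() if w in text)

def expand_subgraph(question, seeds, hops=2, top_k=3):
    """Grow the subgraph hop by hop: collect candidate triples from the
    current frontier, rerank them, and prune to the top-k so the subgraph
    passed to the LLM stays compact."""
    frontier, subgraph, visited = list(seeds), [], set(seeds)
    for _ in range(hops):
        candidates = [
            (entity, rel, nbr)
            for entity in frontier
            for rel, nbr in GRAPH.get(entity, [])
        ]
        # Rerank candidates and keep only the most question-relevant edges.
        candidates.sort(key=lambda t: relevance(question, t), reverse=True)
        kept = candidates[:top_k]
        subgraph.extend(kept)
        frontier = [nbr for _, _, nbr in kept if nbr not in visited]
        visited.update(frontier)
    return subgraph

sub = expand_subgraph("Which union is France a member of?", ["Paris"])
```

Starting from the seed entity `Paris`, two hops suffice to pull in the `("France", "member_of", "EU")` edge while pruning keeps the retrieved subgraph small — the same compact-yet-complete trade-off the abstract attributes to KGMP's reranking step.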
