Abstract
Knowledge graph completion (KGC) is a fundamental task for improving downstream applications such as semantic search and question answering. Effective KGC requires integrating structural and description information so that each compensates for the other's weaknesses (e.g., long-tail entities or overlooked structural knowledge). Existing work typically integrates the two at the embedding level by feeding structure embeddings into pre-trained language models (PLMs) and coupling them via attention mechanisms, which ensures their complementarity. However, many KG entities are multi-semantic: in certain triplets they exhibit semantics beyond their textual descriptions, which PLMs struggle to learn. Because current embedding-level coupling approaches fail to transfer this multi-semantic knowledge, learned by the structure model, to the PLM, the integration effect leaves room for improvement. To alleviate this issue, we propose AKD-KGC, which realizes this knowledge transfer and thereby enhances integration by adding a teaching-learning procedure, based on Adaptive Knowledge Distillation, to the feature-integration stage of KGC. The AKD-KGC framework integrates the two features at the embedding level while using the structural model to guide the prediction behavior of the integration model; this additional supervision adjusts the PLM's weights and strengthens its learning of entity semantics beyond descriptions. AKD-KGC can be applied in both transductive and inductive settings and achieves state-of-the-art results on numerous datasets in both, demonstrating the effectiveness of our method. Our code and datasets are available at https://github.com/liqingsong1227/AKD-KGC.