Abstract
Hyper-relational knowledge graphs extend traditional knowledge graphs by attaching multi-dimensional auxiliary information to standard triples, enriching the diversity and complexity of information representation. However, this characteristic also complicates the task of N-ary Fact Link Prediction: unlike binary relational knowledge representations, N-ary Facts take more complex and varied forms. To address the insufficient utilization of heterogeneous graph structure information in existing N-ary Fact representation methods, this paper proposes an N-ary Graph Transformer (NAGT) model. The model incorporates a new attention mechanism based on N-ary structural bias; by improving the representation of N-ary heterogeneous graphs, it more accurately identifies the key associations needed for link prediction. Experiments on the JF17K, WikiPeople, and WD50K datasets demonstrate that NAGT outperforms comparative methods in extracting structural information, effectively completes the knowledge graph, and exhibits both efficiency and robustness on the N-ary Fact Link Prediction task.