Abstract
Knowledge graphs (KGs) play an increasingly important role in recommender systems. Recently, models based on the Graph Convolutional Network (GCN) and the Graph Attention Network (GAT) have gradually become the mainstream approach for the Collaborative Knowledge Graph (CKG). However, recommender systems encounter long-tail distributions in large-scale graphs: inherent data sparsity concentrates relationships within a few entities, producing uneven embedding distributions. Contrastive Learning (CL) counters data sparsity in recommender systems by extracting general representations from raw data, improving the handling of long-tail distributions. Nonetheless, traditional graph augmentation techniques have proven to be of limited use in CL-based recommendation. Accordingly, this paper proposes a Noise-Augmented Knowledge Graph Attention Contrastive Learning method (NA-KGACL) for recommendation. The proposed method establishes a multi-level contrastive framework by integrating CL with a knowledge-aware GAT, where node representations are refined via projection heads and shuffled batch normalization. In addition, it replaces ineffective graph augmentation methods with a simple yet powerful noise-augmentation algorithm for generating contrastive views. Experimental results on three large-scale, real-world datasets show that NA-KGACL improves the learned representations, yielding higher recommendation accuracy and more efficient training.