Social networks contain complex graph structures and rich textual information. Text provides important information for various tasks, while the graph structure offers multilevel context for the semantics of that text. Contemporary researchers tend to represent such data as text-attributed graphs (TAGs). Most TAG-based representation learning methods focus on designing frameworks that convey graph structure to large language models (LLMs), which generate semantic embeddings for downstream graph neural networks (GNNs). However, these methods attach text attributes only to nodes, failing to capture the multilevel context and losing valuable information. To tackle this issue, we introduce the Multilevel Context Learner (MCL) model, which leverages multilevel context on social networks to enhance the semantic embedding capabilities of LLMs. We model the social network as a multilevel context textual-edge graph (MC-TEG), effectively capturing both graph structure and semantic relationships. The MCL model exploits the reasoning capabilities of LLMs to generate semantic embeddings by integrating these multilevel contexts, and tailored bidirectional dynamic graph attention layers further distinguish the weight information. Experimental evaluations on six real social network datasets show that MCL consistently outperforms all baselines, achieving prediction accuracies of 77.98%, 77.63%, 74.61%, 76.40%, 72.89%, and 73.40%, with absolute improvements of 9.04%, 9.19%, 11.05%, 7.24%, 6.11%, and 9.87% over the next best models. These results demonstrate the effectiveness of the proposed MCL model.
Multilevel Context Learning with Large Language Models for Text-Attributed Graphs on Social Networks
Authors: Cai Xiaokang, Gong Ruoyuan, Jiang Hao
| Journal: | Entropy | Impact factor: | 2.000 |
| --- | --- | --- | --- |
| Year: | 2025 | Issue/pages: | 2025 Mar 10; 27(3):286 |
| DOI: | 10.3390/e27030286 | | |
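
The abstract names "bidirectional dynamic graph attention layers" but does not define them. The PyTorch sketch below shows one plausible reading under stated assumptions: "dynamic" attention in the GATv2 sense (nonlinearity applied before the attention projection), run separately over forward and reversed edges and then merged. All class names, shapes, and the merge step are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphAttention(nn.Module):
    """One direction of GATv2-style 'dynamic' attention over a directed edge list."""

    def __init__(self, dim: int):
        super().__init__()
        self.w_src = nn.Linear(dim, dim, bias=False)  # transform for message senders
        self.w_dst = nn.Linear(dim, dim, bias=False)  # transform for message receivers
        self.attn = nn.Linear(dim, 1, bias=False)     # scores computed after the nonlinearity

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node embeddings; edge_index: (2, E) with rows (source, target).
        src, dst = edge_index
        h_src, h_dst = self.w_src(x), self.w_dst(x)
        # Applying LeakyReLU *before* the attention projection is what makes the
        # attention "dynamic" in the GATv2 sense (Brody et al., 2022).
        scores = self.attn(F.leaky_relu(h_src[src] + h_dst[dst])).squeeze(-1)
        # Softmax over the incoming edges of each target node (loop kept simple for clarity).
        alpha = torch.empty_like(scores)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(scores[mask], dim=0)
        # Weighted aggregation of source messages into target nodes.
        out = torch.zeros_like(h_src)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * h_src[src])
        return out


class BidirectionalDynamicAttention(nn.Module):
    """Run dynamic attention along both edge directions and merge the two views."""

    def __init__(self, dim: int):
        super().__init__()
        self.fwd_attn = DynamicGraphAttention(dim)
        self.bwd_attn = DynamicGraphAttention(dim)
        self.merge = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        reversed_edges = edge_index.flip(0)  # swap the (source, target) rows
        h = torch.cat([self.fwd_attn(x, edge_index),
                       self.bwd_attn(x, reversed_edges)], dim=-1)
        return F.relu(self.merge(h))


# Toy usage: 4 nodes whose features stand in for LLM-generated semantic embeddings.
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
layer = BidirectionalDynamicAttention(16)
print(layer(x, edge_index).shape)  # torch.Size([4, 16])
```

In the MCL pipeline as described, the input `x` would be the LLM-generated embeddings of the multilevel textual context; random features stand in here. Attending over both edge directions lets each node weight information from its followers and followees differently, which is one way the layer could "further distinguish the weight information" on a social graph.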
