Abstract:
The emergence of Large Language Models (LLMs) has driven progress in deep learning. As LLMs have developed, they have gained the ability to process various types of input, including images and videos. However, a significant gap remains in LLMs' understanding of graph-structured data and its inherent complexity. Graph Neural Networks (GNNs) are mature neural network models specifically designed to handle irregular graph-structured data. The challenge, however, is how to combine the advantages of GNNs and LLMs rather than relying solely on the performance of GNNs on graph-structured datasets. Inspired by the progress of LLMs, this paper proposes a novel approach that integrates graph structure information with textual data, aiming to leverage the power of LLMs in understanding complex graph-based data. Our framework standardizes both graph structures and text datasets into fixed-length embeddings, ensuring compatibility with LLM processing requirements. A lightweight converter forges links between the disparate data modalities, preserving the integrity and characteristics of each original data source while converting them into a unified representation. Furthermore, the framework incorporates Retrieval Augmented Thoughts (RAT) to integrate external knowledge sources effectively, enriching the context and coherence of generated responses. In rigorous evaluation against prevailing benchmarks, the proposed method shows superior performance across four distinct datasets, outperforming the cutting-edge GNN models currently dominating the field. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
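The abstract does not detail the "lightweight converter" that maps variable-size graph representations into a fixed-length, LLM-compatible embedding. A minimal sketch, assuming a mean-pooling plus linear-projection design; the class name `GraphToLLMConverter` and the dimensions `d_gnn`/`d_llm` are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

class GraphToLLMConverter:
    """Hypothetical sketch: pool per-node GNN embeddings of a graph into a
    fixed-length vector, then project it into the LLM's embedding space."""

    def __init__(self, d_gnn: int, d_llm: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # In practice W and b would be learned; random init suffices here.
        self.W = rng.standard_normal((d_gnn, d_llm)) / np.sqrt(d_gnn)
        self.b = np.zeros(d_llm)

    def __call__(self, node_embeddings: np.ndarray) -> np.ndarray:
        # node_embeddings: (num_nodes, d_gnn), where num_nodes varies per graph.
        pooled = node_embeddings.mean(axis=0)  # fixed-length (d_gnn,) summary
        return pooled @ self.W + self.b        # projected to (d_llm,)

converter = GraphToLLMConverter(d_gnn=64, d_llm=128)
graph_a = np.random.rand(10, 64)   # a 10-node graph
graph_b = np.random.rand(37, 64)   # a 37-node graph
# Graphs of different sizes map to embeddings of the same LLM-compatible length.
print(converter(graph_a).shape, converter(graph_b).shape)
```

Whatever the actual converter architecture, the key property shown here is that graphs with different node counts all land in one consistent embedding dimension, which is what makes them consumable alongside text by the LLM.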
ISSN: 0302-9743
Year: 2025
Volume: 15389 LNAI
Page: 221-233
Language: English
ESI Highly Cited Papers on the List: 0