
Author:

Jiang, Z. | Feng, C. | Zhang, J. | Bai, X.

Indexed by:

Scopus

Abstract:

The emergence of Large Language Models (LLMs) has driven progress in deep learning. As LLMs have developed, they have gained the ability to process various types of input, including images and videos. However, a significant gap remains in LLMs' understanding of graph-structured data and its inherent complexity. Graph Neural Networks (GNNs) are mature models designed specifically to handle irregular graph-structured data; the challenge is how to combine the advantages of GNNs and LLMs, rather than relying solely on the performance of GNNs on graph-structured datasets. Inspired by the progress of LLMs, this paper proposes a novel approach that integrates graph structure information with textual data, aiming to leverage the power of LLMs in understanding complex graph-based data. Our framework standardizes both graph structures and text datasets into fixed-length embeddings, ensuring compatibility with LLM processing requirements. A lightweight converter forges links between the disparate data modalities, preserving the integrity and characteristics of each original modality while converting them into a unified representation. The framework further incorporates Retrieval Augmented Thoughts (RAT) to integrate external knowledge sources effectively, thereby enriching the context and coherence of generated responses. In rigorous evaluation against prevailing benchmarks, the proposed method shows superior performance across four distinct datasets, outperforming the cutting-edge GNN models that currently dominate the field. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
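The graph-to-LLM pipeline the abstract describes (encode a graph into a fixed-length embedding, then map it into the LLM's input space with a lightweight converter) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a mean-aggregation GNN encoder with mean pooling and a single linear projection as the converter, and every name here (`gnn_layer`, `graph_to_fixed_embedding`, `LightweightConverter`) is hypothetical.

```python
import numpy as np

def gnn_layer(adj, feats):
    # One round of mean-aggregation message passing (simplified GCN-style layer):
    # each node averages its own features with those of its neighbors.
    deg = adj.sum(axis=1, keepdims=True) + 1.0  # +1 for the implicit self-loop
    return (feats + adj @ feats) / deg

def graph_to_fixed_embedding(adj, feats, layers=2):
    # Encode a whole graph into one fixed-length vector: a few rounds of
    # message passing, then mean pooling over nodes.
    h = feats
    for _ in range(layers):
        h = gnn_layer(adj, h)
    return h.mean(axis=0)  # graph-level embedding, shape (feat_dim,)

class LightweightConverter:
    # Hypothetical linear projector mapping a graph embedding into the LLM's
    # token-embedding space, so it can be prepended to text embeddings as a
    # soft prompt.
    def __init__(self, graph_dim, llm_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((graph_dim, llm_dim)) / np.sqrt(graph_dim)

    def __call__(self, g_emb):
        return g_emb @ self.W

# Toy 3-node path graph (edges 0-1 and 1-2) with one-hot node features.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3)                                   # graph_dim = 3
g_emb = graph_to_fixed_embedding(adj, feats)        # shape (3,)
soft_prompt = LightweightConverter(graph_dim=3, llm_dim=8)(g_emb)
print(soft_prompt.shape)                            # one pseudo-token for the LLM
```

In a real system the converter would be trained end-to-end so that the projected graph embedding is meaningful to the frozen LLM; here the random projection only illustrates the data flow.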

Keyword:

Large Language Models; Deep Learning; Graph Neural Networks; Retrieval Augmented Thoughts

Author Community:

  • [ 1 ] [Jiang Z.]Beijing University of Technology, Pingleyuan Street, Beijing, China
  • [ 2 ] [Feng C.]Beijing University of Technology, Pingleyuan Street, Beijing, China
  • [ 3 ] [Zhang J.]Beijing University of Technology, Pingleyuan Street, Beijing, China
  • [ 4 ] [Bai X.]Beijing University of Technology, Pingleyuan Street, Beijing, China


Source:

ISSN: 0302-9743

Year: 2025

Volume: 15389 LNAI

Page: 221-233

Language: English


ESI Highly Cited Papers on the List: 0

