Author:

Pan, Shirui | Luo, Linhao | Wang, Yufei | Chen, Chen | Wang, Jiapu | Wu, Xindong

Indexed by:

EI; Scopus; SCIE

Abstract:

Large language models (LLMs), such as ChatGPT and GPT4, are making new waves in the field of natural language processing and artificial intelligence, due to their emergent ability and generalizability. However, LLMs are black-box models, which often fall short of capturing and accessing factual knowledge. In contrast, Knowledge Graphs (KGs), Wikipedia, and Huapu for example, are structured knowledge models that explicitly store rich factual knowledge. KGs can enhance LLMs by providing external knowledge for inference and interpretability. Meanwhile, KGs are difficult to construct and evolve by nature, which challenges the existing methods in KGs to generate new facts and represent unseen knowledge. Therefore, it is complementary to unify LLMs and KGs together and, simultaneously, leverage their advantages. In this article, we present a forward-looking roadmap for the unification of LLMs and KGs. Our roadmap consists of three general frameworks, namely: 1) KG-enhanced LLMs, which incorporate KGs during the pre-training and inference phases of LLMs, or for the purpose of enhancing understanding of the knowledge learned by LLMs; 2) LLM-augmented KGs, that leverage LLMs for different KG tasks such as embedding, completion, construction, graph-to-text generation, and question answering; and 3) Synergized LLMs + KGs, in which LLMs and KGs play equal roles and work in a mutually beneficial way to enhance both LLMs and KGs for bidirectional reasoning driven by both data and knowledge. We review and summarize existing efforts within these three frameworks in our roadmap and pinpoint their future research directions.
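To make the "KG-enhanced LLMs" framework sketched in the abstract concrete, the short Python example below retrieves triples about entities mentioned in a question from a toy in-memory knowledge graph and serializes them into a grounding prompt for an LLM. The triple store, the string-matching retrieval, and the prompt template are illustrative assumptions for this record only, not the method proposed in the paper.

from typing import List, Tuple

# A toy knowledge graph stored as (head, relation, tail) triples.
# Purely illustrative data, not taken from the paper.
KG: List[Tuple[str, str, str]] = [
    ("Joe Biden", "born_in", "Scranton"),
    ("Scranton", "located_in", "Pennsylvania"),
    ("Joe Biden", "profession", "politician"),
]

def retrieve_triples(question: str, kg: List[Tuple[str, str, str]]) -> List[Tuple[str, str, str]]:
    """Return triples whose head or tail entity appears in the question."""
    q = question.lower()
    return [(h, r, t) for h, r, t in kg if h.lower() in q or t.lower() in q]

def build_prompt(question: str, kg: List[Tuple[str, str, str]]) -> str:
    """Serialize the retrieved triples as grounding context ahead of the question."""
    facts = retrieve_triples(question, kg)
    context = "\n".join(f"({h}, {r}, {t})" for h, r, t in facts)
    return f"Known facts:\n{context}\n\nQuestion: {question}\nAnswer:"

if __name__ == "__main__":
    # The resulting prompt would be passed to any LLM; here it is only printed.
    print(build_prompt("Which US state was Joe Biden born in?", KG))

This sketch covers only the inference-time integration point; incorporating KGs during pre-training, the other point named in the abstract, would instead modify the training corpus or objectives rather than the prompt.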

Keyword:

Natural language processing; Cognition; bidirectional reasoning; Predictive models; Knowledge graphs; generative pre-training; Decoding; Task analysis; Training; knowledge graphs; roadmap; large language models; Chatbots

Author Community:

  • [ 1 ] [Pan, Shirui]Griffith Univ, Sch Informat & Commun Technol, Nathan, Qld 4111, Australia
  • [ 2 ] [Pan, Shirui]Griffith Univ, Inst Integrated & Intelligent Syst IIIS, Nathan, Qld 4111, Australia
  • [ 3 ] [Luo, Linhao]Monash Univ, Dept Data Sci & AI, Melbourne, Vic 3800, Australia
  • [ 4 ] [Wang, Yufei]Monash Univ, Dept Data Sci & AI, Melbourne, Vic 3800, Australia
  • [ 5 ] [Chen, Chen]Nanyang Technol Univ, Nanyang 639798, Singapore
  • [ 6 ] [Wang, Jiapu]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 7 ] [Wu, Xindong]Hefei Univ Technol, Key Lab Knowledge Engn Big Data, Minist Educ China, Hefei 230002, Peoples R China
  • [ 8 ] [Wu, Xindong]Zhejiang Lab, Res Ctr Knowledge Engn, Hangzhou 310058, Peoples R China

Reprint Author's Address:

  • [Wu, Xindong]Hefei Univ Technol, Key Lab Knowledge Engn Big Data, Minist Educ China, Hefei 230002, Peoples R China

Source:

IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING

ISSN: 1041-4347

Year: 2024

Issue: 7

Volume: 36

Page: 3580-3599

Impact Factor: 8.900 (JCR@2022)

ESI Highly Cited Papers on the List: 5

  • 2025-5
  • 2025-3
  • 2025-1
  • 2024-11
  • 2024-11
