Author:

Hou, Ruotian | Zhu, Wenjun | Zhu, Cui

Indexed by:

EI; Scopus

Abstract:

Knowledge Graph Completion (KGC), the task of predicting missing links from the known triples of a knowledge graph, has been an active research topic in recent years. Recent work has shown that graph neural networks (GNNs) that exploit graph structure can perform well on KGC. These models learn from the entities and relations in the subject entity's neighborhood and update its representation through a message-passing mechanism. However, existing GNN models rarely model relational information explicitly: they tend to represent and learn nodes through complex networks while ignoring the underlying semantic information between relations. In this work, we propose a global relationship-assisted graph attention network (GRA-GAT). It models not only entities but also relations, building a directed graph structure between different relations and updating their representations. Specifically, strongly correlated neighboring relations are identified for aggregation by an attention function based on the information and spatial domains, and a learnable nonlinear function is used to activate the attention values, allowing the model to aggregate information adaptively. Experiments show that GRA-GAT achieves highly competitive performance on link prediction tasks. © 2022 IEEE.
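
As a rough illustration of the mechanism the abstract describes, the sketch below shows relation-level attention in which the activation applied to the attention scores is itself learnable. It is a minimal, hypothetical PyTorch reconstruction from the abstract alone: the class name, the single learnable slope parameter, and the residual update are all assumptions here, not the authors' GRA-GAT implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAttention(nn.Module):
    # Hypothetical sketch: aggregate the embeddings of strongly correlated
    # neighboring relations into a target relation's representation, with a
    # learnable slope replacing a fixed LeakyReLU on the attention scores.
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)        # scores a (target, neighbor) pair
        self.slope = nn.Parameter(torch.ones(1))  # learnable activation parameter

    def forward(self, rel, neighbors):
        # rel: (dim,) target relation; neighbors: (n, dim) correlated relations
        pairs = torch.cat([rel.expand_as(neighbors), neighbors], dim=-1)
        raw = self.score(pairs).squeeze(-1)                # (n,) raw attention values
        act = torch.where(raw > 0, raw, self.slope * raw)  # learnable nonlinearity
        weights = F.softmax(act, dim=0)                    # attention over neighbors
        return rel + weights @ neighbors                   # residual relation update

# Usage: update one relation embedding from three neighboring relations.
layer = RelationAttention(dim=8)
updated = layer(torch.randn(8), torch.randn(3, 8))
print(updated.shape)  # torch.Size([8])

Whether GRA-GAT uses a residual update or a single-parameter slope in this exact form is not stated in the record; the sketch only fixes the idea of attention values passed through a learnable nonlinear function before aggregation.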

Keyword:

Semantics; Graph neural networks; Directed graphs; Graphic methods; Complex networks; Knowledge graph; Message passing

Author Community:

  • [ 1 ] [Hou, Ruotian] Beijing University of Technology, College of Computer Science, Beijing, China
  • [ 2 ] [Zhu, Wenjun] Beijing University of Technology, College of Computer Science, Beijing, China
  • [ 3 ] [Zhu, Cui] Beijing University of Technology, College of Computer Science, Beijing, China

Reprint Author's Address:

Email:

Source:

Year: 2022

Page: 532-538

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 3

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 7

Affiliated Colleges:
