
Author:

Hu, S. | Wang, B. | Wang, J. | Ma, Y. | Zhao, L.

Indexed by:

EI Scopus

Abstract:

A temporal knowledge graph is a structured semantic knowledge base containing quadruple facts that evolve over time. Inferring missing facts is one of the main challenges for temporal knowledge graphs, a task known as temporal knowledge graph completion (TKGC). The Transformer has strong modeling ability across a variety of domains, since its self-attention mechanism can model the global dependencies of input sequences, yet few studies have explored Transformer encoders for TKGC tasks. To address this gap, we propose a novel end-to-end TKGC model named Transbe-TuckERTT that adopts an encoder-decoder architecture. Specifically, the proposed model employs a Transformer-based encoder to facilitate interaction between the entities, relations, and temporal information within each quadruple, generating highly expressive embeddings. The TuckERTT decoder then uses the encoded embeddings to predict missing facts in the knowledge graph. Experimental results demonstrate that the proposed model outperforms several state-of-the-art TKGC methods on three public benchmark datasets, verifying the effectiveness of the self-attention mechanism in the Transformer-based encoder for capturing dependencies in the temporal knowledge graph. © 2023 IEEE.
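The abstract describes a Tucker-decomposition-style decoder extended with temporal embeddings. The paper's own implementation is not shown here; the following is a minimal illustrative sketch (all dimension names, variable names, and the random initialisation are assumptions) of how such a decoder can score every candidate object entity for a query (subject, relation, ?, timestamp) by contracting a core tensor with the subject, relation, and time embeddings:

```python
import numpy as np

# Illustrative sketch only -- not the authors' TuckERTT code.
# Assumed embedding dimensions: d_e (entity), d_r (relation), d_t (time).
rng = np.random.default_rng(0)
d_e, d_r, d_t, n_entities = 4, 3, 2, 5

W = rng.standard_normal((d_r, d_e, d_t, d_e))  # core tensor: relation x subject x time x object
E = rng.standard_normal((n_entities, d_e))     # entity embedding table
R = rng.standard_normal((1, d_r))              # one relation embedding
T = rng.standard_normal((1, d_t))              # one timestamp embedding

def score_all_objects(s_idx, r_vec, t_vec):
    """Score every candidate object entity for the query (s, r, ?, t)."""
    s = E[s_idx]                               # subject embedding, shape (d_e,)
    x = np.einsum('r,rsto->sto', r_vec, W)     # contract relation -> (d_e, d_t, d_e)
    x = np.einsum('s,sto->to', s, x)           # contract subject  -> (d_t, d_e)
    x = np.einsum('t,to->o', t_vec, x)         # contract time     -> (d_e,)
    return E @ x                               # one score per candidate entity

scores = score_all_objects(0, R[0], T[0])
print(scores.shape)  # (5,) -- one score per entity
```

In the full model, the embeddings fed to such a decoder would come from the Transformer-based encoder rather than random tables, and the scores would be passed through a softmax and trained against the observed quadruples.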

Keyword:

Temporal knowledge graph; Knowledge graph embedding; Knowledge graph; Knowledge graph completion

Author Community:

  • [ 1 ] [Hu S.]Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 2 ] [Wang B.]Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 3 ] [Wang J.]Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 4 ] [Ma Y.]Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 5 ] [Zhao L.]Beijing University of Technology, Faculty of Information Technology, Beijing, China

Reprint Author's Address:

Email:

Source:

Year: 2023

Page: 443-448

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

Affiliated Colleges:
