
Author:

Li, Mengran | Chen, Junzhou | Li, Bo | Zhang, Yong | Zhang, Ronghui | Gong, Siyuan | Ma, Xiaolei | Tian, Zhihong

Indexed by:

Scopus SCIE

Abstract:

Temporal dynamic graphs (TDGs), which represent the evolution of entities and their relationships over time with intricate temporal features, are widely used in real-world domains. Existing methods typically rely on mainstream techniques such as Transformers and graph neural networks (GNNs) to capture the spatiotemporal information of TDGs. However, despite their advanced capabilities, these methods often suffer from significant computational complexity and a limited ability to capture temporal dynamic contextual relationships. Recently, a new model architecture called Mamba has emerged, noted for its ability to capture complex dependencies in sequences while substantially reducing computational complexity. Building on this, we propose a novel method, TDG-Mamba, which integrates Mamba for TDG learning. TDG-Mamba introduces deep semantic spatiotemporal embeddings into the Mamba architecture through a specially designed spatiotemporal prior tokenization module (SPTM). Furthermore, to better exploit differences in temporal information and to enhance the modeling of dynamic changes in graph structure, we separately design a bidirectional Mamba and a directed GNN for improved spatiotemporal embedding learning. Link prediction experiments on multiple public datasets demonstrate that our method delivers superior performance, with an average improvement of 5.11% over baseline methods across various settings.
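The record does not include the paper's implementation, but the core mechanism the abstract builds on (a selective state-space recurrence, as in Mamba, optionally run bidirectionally) can be sketched as follows. This is a simplified illustration under stated assumptions, not the authors' TDG-Mamba code: real Mamba uses learned projections, zero-order-hold discretization, and a hardware-aware parallel scan, and the bidirectional fusion-by-summation shown here is a hypothetical choice.

```python
import numpy as np

def selective_ssm_scan(x, A, B_proj, C_proj, dt_proj):
    """Minimal selective state-space scan (the recurrence behind Mamba-style
    models), run as a plain sequential loop.

    x: (T, D) input sequence; A: (D, N) diagonal state matrix (kept negative
    for stability); B_proj/C_proj: (D, N); dt_proj: (D, D).
    """
    T, D = x.shape
    h = np.zeros((D, A.shape[1]))           # per-channel hidden state
    ys = np.empty((T, D))
    for t in range(T):
        # input-dependent ("selective") parameters at step t
        dt = np.log1p(np.exp(x[t] @ dt_proj))[:, None]   # softplus step size, (D, 1)
        B = x[t] @ B_proj                    # input projection, (N,)
        C = x[t] @ C_proj                    # output projection, (N,)
        # zero-order-hold discretization of dh/dt = A h + B x
        h = np.exp(dt * A) * h + dt * B[None, :] * x[t][:, None]
        ys[t] = h @ C                        # per-channel readout
    return ys

rng = np.random.default_rng(0)
T, D, N = 6, 4, 8
x = rng.standard_normal((T, D))
A = -np.exp(rng.standard_normal((D, N)))     # negative entries keep the scan stable
B_proj, C_proj = rng.standard_normal((D, N)), rng.standard_normal((D, N))
dt_proj = 0.1 * rng.standard_normal((D, D))

# a bidirectional variant: scan forward and backward, then fuse (here: sum)
y_fwd = selective_ssm_scan(x, A, B_proj, C_proj, dt_proj)
y_bwd = selective_ssm_scan(x[::-1], A, B_proj, C_proj, dt_proj)[::-1]
y = y_fwd + y_bwd
```

The forward pass only sees past tokens, so the backward scan gives each position access to future context as well; how the two directions are fused (sum, concatenation, gating) is a design choice the abstract does not specify.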

Keyword:

Mamba model; graph neural networks; transportation; long short-term memory; link prediction; computer architecture; spatiotemporal phenomena; convolution; dynamic temporal graph; computational modeling; semantics; spatiotemporal prior tokenization module; representation learning; Transformers

Author Community:

  • [ 1 ] [Li, Mengran]Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangdong Key Lab Intelligent Transportat Syst, Guangzhou 510275, Guangdong, Peoples R China
  • [ 2 ] [Chen, Junzhou]Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangdong Key Lab Intelligent Transportat Syst, Guangzhou 510275, Guangdong, Peoples R China
  • [ 3 ] [Zhang, Ronghui]Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangdong Key Lab Intelligent Transportat Syst, Guangzhou 510275, Guangdong, Peoples R China
  • [ 4 ] [Li, Bo]Beijing Univ Technol, Beijing Inst Artificial Intelligence, Beijing 100124, Peoples R China
  • [ 5 ] [Zhang, Yong]Beijing Univ Technol, Beijing Inst Artificial Intelligence, Beijing 100124, Peoples R China
  • [ 6 ] [Gong, Siyuan]Changan Univ, Sch Informat & Engn, Xian 710064, Peoples R China
  • [ 7 ] [Ma, Xiaolei]Beihang Univ, Sch Transportat Sci & Engn, Key Lab Intelligent Transportat Technol & Syst, Beijing 100191, Peoples R China
  • [ 8 ] [Tian, Zhihong]Guangzhou Univ, Cyberspace Inst Adv Technol, Guangzhou 510006, Peoples R China

Reprint Author's Address:

  • [Zhang, Ronghui]Sun Yat Sen Univ, Sch Intelligent Syst Engn, Guangdong Key Lab Intelligent Transportat Syst, Guangzhou 510275, Guangdong, Peoples R China



Source :

IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS

ISSN: 2329-924X

Year: 2024

Impact Factor: 5.000 (JCR@2022)


ESI Highly Cited Papers on the List: 0

