
Author:

Wang, L. | Shi, Y. | Wang, J. | Yin, B. | Ling, N.

Indexed by:

CPCI-S EI Scopus

Abstract:

End-to-end learned image compression exploits the expressive power of nonlinear transform modules to de-correlate the spatial redundancies of image content. Owing to their long-range attention scheme, transformer-based transforms can exploit more global features for better reconstruction. However, transformer modules incur substantial computational costs, and the coarse-grained use of transformers in learned image compression falls short of the desired coding efficiency. In this paper, we propose a novel graph-structured swin-transformer for learned image compression, shown in Figure 1. We assume that the global receptive field of the attention map should be sparse rather than dense, while the local neighboring correlations must remain strong. © 2024 IEEE.
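The record contains no code, but the abstract's core assumption (a sparse global receptive field combined with dense local attention) can be illustrated with a small sketch. The function below is purely hypothetical and not from the paper: it builds a boolean attention mask over a 1D token sequence in which each token attends densely to a local window and sparsely to strided "anchor" tokens for global context.

```python
import numpy as np

def sparse_local_global_mask(n_tokens: int, window: int = 4, global_stride: int = 8) -> np.ndarray:
    """Boolean attention mask: dense within a local neighborhood,
    sparse (strided) links for the global receptive field.

    This is an illustrative sketch of the sparse-global / dense-local
    idea, not the architecture proposed in the paper."""
    idx = np.arange(n_tokens)
    # Dense local band: token i attends to j when |i - j| < window
    local = np.abs(idx[:, None] - idx[None, :]) < window
    # Sparse global links: every token also attends to strided anchors
    global_anchors = (idx[None, :] % global_stride) == 0
    return local | global_anchors

mask = sparse_local_global_mask(16, window=2, global_stride=8)
# Each row has a dense band around the diagonal plus sparse anchor columns.
```

Applied at every attention layer, such a mask keeps the cost of global context far below that of dense all-pairs attention while preserving strong local correlations.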

Author Community:

  • [ 1 ] [Wang L.]Beijing University of Technology, Faculty of Information Technology, China
  • [ 2 ] [Shi Y.]Beijing University of Technology, Faculty of Information Technology, China
  • [ 3 ] [Wang J.]Beijing University of Technology, Faculty of Information Technology, China
  • [ 4 ] [Yin B.]Beijing University of Technology, Faculty of Information Technology, China
  • [ 5 ] [Ling N.]Santa Clara University, Department of Computer Science and Engineering, United States

Source:

ISSN: 1068-0314

Year: 2024

Page: 592-

Language: English

Cited Count:

WoS CC Cited Count: 36

ESI Highly Cited Papers on the List: 0

