
Author:

Zhou, B. | Wang, S. | Xiao, S.

Indexed by:

EI Scopus

Abstract:

Crowd counting algorithms play an important role in public safety management. Most current mainstream crowd counting methods are based on deep convolutional neural networks (CNNs), which use multi-column or multi-scale convolutional structures to obtain contextual information from images and compensate for the effect of perspective distortion on counting results. However, because convolution is locally connected, these methods cannot capture enough global context, which often leads to misidentification in complex background regions and reduces counting accuracy. To solve this problem, we first design a double recursive sparse self-attention module, which captures long-range dependency information and mitigates background false detections while reducing computation and parameter count. Second, we design a feature-pyramid-based Transformer structure as the feature extraction module of the crowd counting algorithm, which effectively improves its ability to extract global information. Experimental results on public datasets show that the proposed algorithm outperforms current mainstream crowd counting methods and effectively alleviates the background false detection problem in complex scene images. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
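The core idea the abstract describes, restricting self-attention so that its cost drops below the quadratic cost of full attention, can be illustrated with a minimal sketch. This is not the paper's double recursive sparse self-attention module; it is a generic local-window variant, with the function name and window parameter chosen here for illustration, showing how sparsifying the attention pattern reduces computation from O(n²·d) to O(n·w·d):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_window_self_attention(x, window=4):
    """Self-attention where each position attends only to a local window.

    x: (n, d) array of n feature vectors. Each query attends to keys within
    `window` positions on either side, so the cost is O(n * window * d)
    instead of the O(n^2 * d) of full self-attention.
    """
    n, d = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        keys = x[lo:hi]                        # (w, d) local neighbourhood
        scores = keys @ x[i] / np.sqrt(d)      # scaled dot-product scores
        weights = softmax(scores)              # attention weights over window
        out[i] = weights @ keys                # weighted sum of local values
    return out
```

A sparse pattern like this trades the full receptive field of dense attention for efficiency; the paper's recursive design is a different way of making that trade while still propagating long-range information.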

Keyword:

Crowd density estimation; Crowd count; Transformer; Deep learning; Attention mechanism

Author Community:

  • [ 1 ] [Zhou B.]Beijing University of Technology, Beijing, China
  • [ 2 ] [Wang S.]Beijing University of Technology, Beijing, China
  • [ 3 ] [Xiao S.]Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Source:

ISSN: 0302-9743

Year: 2022

Volume: 13534 LNCS

Page: 722-734

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
