
Author:

Zhang, Zihan | Xie, Xuesong | Zhang, Xiaoling

Indexed by:

EI

Abstract:

Deep neural networks have been widely applied across various domains, but their large parameter counts and high computational demands limit their practical deployment. To address this, this paper introduces a convolutional neural network compression method based on multi-factor channel pruning. By integrating the scaling and shifting factors of batch normalization layers, a multi-factor channel salience metric is proposed to measure channel importance; redundant channels are then removed from the convolutional neural network to obtain a compressed model. On the CIFAR-10 dataset, the method pruned 93.06% of the parameters and 91.92% of the computations from the VGG13BN network, with only a 2.81% decrease in accuracy. On the CIFAR-100 dataset, it pruned 72.84% of the parameters and 72.03% of the computations from VGG13BN, with an accuracy improvement of 4.11%. © 2023 IEEE.
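The abstract does not spell out how the scaling and shifting factors are combined into the salience score. A minimal sketch of the general idea, assuming a simple combination `|gamma| + |beta|` (the names `channel_salience` and `select_pruned_channels` are illustrative, not from the paper):

```python
import numpy as np

def channel_salience(gamma, beta):
    """Multi-factor salience combining a BN layer's scaling factors
    (gamma) and shifting factors (beta), one value per channel.
    The paper's exact combination is not given in the abstract;
    |gamma| + |beta| is used here purely as an illustration."""
    return np.abs(gamma) + np.abs(beta)

def select_pruned_channels(gamma, beta, prune_ratio):
    """Return the indices of the least-salient channels to remove."""
    s = channel_salience(np.asarray(gamma, dtype=float),
                         np.asarray(beta, dtype=float))
    n_prune = int(len(s) * prune_ratio)
    order = np.argsort(s)  # ascending: least salient channels first
    return sorted(order[:n_prune].tolist())

# Example: 8 channels, prune the 50% with the lowest salience.
gamma = [0.9, 0.05, 1.2, 0.02, 0.7, 0.01, 0.4, 0.03]
beta  = [0.1, 0.02, 0.3, 0.01, 0.2, 0.00, 0.1, 0.02]
print(select_pruned_channels(gamma, beta, 0.5))  # → [1, 3, 5, 7]
```

After the selected channels are removed from each convolutional layer (together with the matching filters in the following layer), the compressed model is typically fine-tuned to recover accuracy.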

Keyword:

Convolutional neural networks; Convolution; Deep neural networks

Author Community:

  • [ 1 ] [Zhang, Zihan]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [Xie, Xuesong]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 3 ] [Zhang, Xiaoling]Faculty of Information Technology, Beijing University of Technology, Beijing, China

Year: 2023

Language: English

Cited Count:

WoS CC Cited Count: 0

ESI Highly Cited Papers on the List: 0

