
Author:

Gadosey, Pius Kwao | Li, Yujian | Yamak, Peter T.

Indexed by:

EI

Abstract:

With the influx of several kinds of mobile electronic devices and the increasing popularity of deep learning networks for computer vision tasks, demand for deploying them on smaller devices will naturally increase. The authors of this paper review and experiment with compact models (MobileNet V1 and V2, ShuffleNet V1 and V2, FD-MobileNet) and selected methods for pruning and quantization of popular Convolutional Neural Networks (CNNs) through transfer learning tasks. They further propose a hybrid technique of per-layer pruning and quantization called Pruned Sparse Binary-Weight Network (PSBWN). The performance of these four techniques is evaluated on image classification tasks on the Caltech-UCSD Birds 200, Oxford Flowers 102 and CALTECH256 datasets, which are all publicly available benchmarks, with a focus on the trade-offs among the number of Floating Point Operations (FLOPs), model size, and training and inference times against accuracy, using the same computational resources. © 2019 Association for Computing Machinery.
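The abstract does not detail PSBWN's mechanics, but the general recipe it names, per-layer pruning combined with binary-weight quantization, can be sketched generically. The function below, its name, the pruning ratio, and the XNOR-Net-style scaling factor are all assumptions for illustration, not the authors' exact procedure:

```python
import numpy as np

def prune_and_binarize(weights, prune_ratio=0.5):
    """Illustrative per-layer pruning + binary-weight quantization.

    Zeroes the smallest-magnitude weights in the layer, then replaces
    each survivor with sign(w) * alpha, where alpha is the mean absolute
    value of the surviving weights (an XNOR-Net-style scaling factor).
    """
    threshold = np.quantile(np.abs(weights).ravel(), prune_ratio)
    mask = np.abs(weights) > threshold          # surviving connections
    alpha = np.abs(weights[mask]).mean() if mask.any() else 0.0
    return np.sign(weights) * alpha * mask      # sparse, binary-valued layer

# Example: prune half of a random 4x4 layer, binarize the remainder.
w = np.random.randn(4, 4)
w_q = prune_and_binarize(w, prune_ratio=0.5)
```

After this step each layer stores only a sign bit per surviving weight plus one scale per layer, which is the source of the model-size and FLOP reductions the abstract weighs against accuracy.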

Keyword:

Benchmarking; Economic and social effects; Convolutional neural networks; Deep learning; Learning systems; Neural networks; Transfer learning; Cloud computing; Classification (of information); Digital arithmetic

Author Community:

  • [ 1 ] [Gadosey, Pius Kwao] Computer Science and Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [Li, Yujian] School of Artificial Intelligence, Guilin University of Electronic Technology, Guilin, Guangxi, China
  • [ 3 ] [Yamak, Peter T.] Computer Science and Technology, Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Source:

Year: 2019

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 4

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 11
