
Author:

Lin, Yuhan | Lei, Minglong | Niu, Lingfeng

Indexed by:

EI

Abstract:

Deep neural networks (DNNs) have achieved great success in many real-world applications, but they also have drawbacks, including considerable storage requirements, large computational power consumption, and latency during training and inference, which make it impractical to deploy state-of-the-art models on embedded systems and portable devices. Compressing DNNs has therefore become a pressing need. In this paper, we focus on quantized neural networks, one scheme for compressing DNNs. We first introduce baseline works on quantized neural networks and then review the optimization approaches used to quantize them. In our view, these methods fall into two categories: minimizing the quantization error and minimizing the loss function. A detailed introduction to each category follows the baseline works, together with our comments on the categories and on individual methods. Finally, we discuss possible future directions in this area and conclude. © 2019 IEEE.
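
To make the first category concrete, the sketch below is a minimal, hypothetical illustration of k-bit uniform weight quantization and the quantization error that such methods minimize. It is not taken from the surveyed paper; the function name quantize_uniform, the bit widths, and the random weight matrix are assumptions for illustration only.

    # Minimal, hypothetical sketch of k-bit uniform weight quantization.
    # Illustrates the "minimizing quantization error" category named in the
    # abstract; it is not the authors' method, and all names and parameters
    # here are illustrative assumptions.
    import numpy as np

    def quantize_uniform(weights: np.ndarray, num_bits: int = 8):
        """Snap float weights onto a symmetric uniform grid and report the error."""
        max_abs = float(np.max(np.abs(weights)))
        if max_abs == 0.0:
            return weights.copy(), 0.0
        levels = 2 ** num_bits - 1                          # representable steps
        scale = 2.0 * max_abs / levels                      # grid step size
        quantized = np.round(weights / scale) * scale       # nearest grid point
        error = float(np.sum((weights - quantized) ** 2))   # ||W - Q(W)||^2
        return quantized, error

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w = rng.standard_normal((256, 128)).astype(np.float32)   # dummy weight matrix
        for bits in (2, 4, 8):
            _, err = quantize_uniform(w, bits)
            print(f"{bits}-bit uniform quantization error: {err:.4f}")

Methods in the second category instead keep the task loss in the loop, for example by running the forward pass with quantized weights and approximating the gradient through the rounding step, so that the quantization is optimized against the loss function rather than against the reconstruction error alone.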

Keyword:

Digital storage; Embedded systems; Neural networks; Data mining; Deep neural networks

Author Community:

  • [ 1 ] [Lin, Yuhan] School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing, China
  • [ 2 ] [Lei, Minglong] Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 3 ] [Niu, Lingfeng] School of Economics and Management, Chinese Academy of Sciences, Beijing, China

Source:

ISSN: 2375-9232

Year: 2019

Volume: 2019-November

Page: 385-390

Language: English

SCOPUS Cited Count: 5

ESI Highly Cited Papers on the List: 0
