
Author:

Zhang, Li | Cheng, Xianwei | Zhao, Hui | Mohanty, Saraju P. | Fang, Juan (Scholars: Fang, Juan)

Indexed by:

EI Scopus

Abstract:

Convolutional Neural Networks (CNNs) have shown great potential in many application domains, including object detection, image classification, natural language processing, and speech recognition. Since the depth of neural network architectures keeps growing and large-scale datasets are required, designing high-performance computing hardware for training CNNs is necessary. In this paper, we measure the performance of different configurations on a GPU platform and study the resulting patterns by training two CNN architectures, LeNet and MiniNet, both of which perform image classification. From these measurements, we identify a correlation between the L1D cache and GPU performance during training. We also demonstrate that the L2D cache only slightly influences performance. The network traffic intensity of both CNN models shows that each layer has a distinct traffic-intensity pattern. © 2019 IEEE.
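The abstract's observation that each layer has a distinct traffic-intensity pattern can be made concrete by tallying per-layer activation and parameter counts. The sketch below uses the classic LeNet-5 topology as a stand-in (the paper's exact LeNet and MiniNet configurations are not given in this record, so the layer sizes here are assumptions): early convolutional layers produce many activations but few weights, while fully connected layers are the opposite, which is one plausible source of the differing memory-traffic patterns.

```python
# Hypothetical sketch: per-layer activation and parameter counts for the
# classic LeNet-5 topology (not necessarily the configuration used in the
# paper), to illustrate why memory traffic differs from layer to layer.

def conv_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a convolution/pooling layer."""
    return (size + 2 * pad - kernel) // stride + 1

def lenet5_stats(in_size=32, in_ch=1):
    """Return [(layer name, activations written, trainable params)]."""
    layers = []
    # (name, kind, kernel, stride, output channels)
    spec = [
        ("conv1", "conv", 5, 1, 6),
        ("pool1", "pool", 2, 2, 6),
        ("conv2", "conv", 5, 1, 16),
        ("pool2", "pool", 2, 2, 16),
    ]
    size, ch = in_size, in_ch
    for name, kind, k, s, out_ch in spec:
        new_size = conv_out(size, k, s)
        acts = new_size * new_size * out_ch              # output feature map
        params = (k * k * ch + 1) * out_ch if kind == "conv" else 0
        layers.append((name, acts, params))
        size, ch = new_size, out_ch
    flat = size * size * ch                              # 5 * 5 * 16 = 400
    for name, out_dim in [("fc1", 120), ("fc2", 84), ("fc3", 10)]:
        layers.append((name, out_dim, flat * out_dim + out_dim))
        flat = out_dim
    return layers

if __name__ == "__main__":
    for name, acts, params in lenet5_stats():
        print(f"{name}: activations={acts}, params={params}")
```

With these assumed sizes, conv1 writes 4704 activations but holds only 156 weights, whereas fc1 writes 120 activations but holds 48120 weights, so the layers stress the cache hierarchy in very different ways.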

Keyword:

Speech recognition; Network architecture; Natural language processing systems; Large dataset; Image classification; Convolutional neural networks; Program processors; Object detection

Author Community:

  • [ 1 ] [Zhang, Li]Computer Science and Engineering Department, University of North Texas, United States
  • [ 2 ] [Cheng, Xianwei]Computer Science and Engineering Department, University of North Texas, United States
  • [ 3 ] [Zhao, Hui]Computer Science and Engineering Department, University of North Texas, United States
  • [ 4 ] [Mohanty, Saraju P.]Computer Science and Engineering Department, University of North Texas, United States
  • [ 5 ] [Fang, Juan]Faculty of Information Technology, Beijing University of Technology, China

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source :

Year: 2019

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 12
