Author:

Ur Rehman, Sadaqat (Ur Rehman, Sadaqat.) | Tu, Shanshan (Tu, Shanshan.) | Huang, Yongfeng (Huang, Yongfeng.)

Indexed by:

EI Scopus

Abstract:

Training a convolutional neural network (CNN) is a global optimization problem. We hypothesize that the smoother and better optimized the training of a CNN is, the more efficient the end result becomes. Therefore, in this short paper, we propose a modified resilient backpropagation (MRPROP) algorithm to improve the convergence and efficiency of CNN training, in which a global-best concept is introduced into the weight-update criterion, allowing the training algorithm to adjust the weights more swiftly and precisely toward a good solution. Experimental results demonstrate that MRPROP outperforms previous benchmark algorithms, improving training speed by up to 4X (four times) and classification accuracy by up to 2% on a public face and skin dataset [1]. In RPROP [2], the change in weight δw depends on the update value δx,y, which is increased or decreased according to the error in order to reach a better solution. However, the previously computed update values are discarded after every iteration, so the best weight changes achieved earlier are never referred back to; there is no information sharing between the best values obtained in previous iterations and the current result. By introducing a 'global best' concept in MRPROP, the information from previous weight changes is used to guide the search toward accurate results. The best past value, in terms of the optimized solution, is selected from all update values of the current weight change and is used in the update process. This variable is called the global best, 'gbst'. The gbst selection procedure in MRPROP is as follows: first, randomly select two of the best update values from all the current weight changes δw; then, compare these two values in terms of the optimized solution and choose the better one as gbst. © Springer International Publishing AG 2017.
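
The abstract describes the gbst (global best) tournament selection but not the full update rule, so the following is only a minimal sketch in Python/NumPy of what one MRPROP-style step could look like. Everything not stated in the abstract is an assumption: the function name mrprop_step, the standard RPROP constants ETA_PLUS/ETA_MINUS and the step-size bounds STEP_MIN/STEP_MAX, the blend factor GAMMA used to fold gbst into the current change, and the choice of drawing the two tournament candidates from a history of earlier weight changes (the abstract is ambiguous about whether they come from past iterations or from entries of the current δw).

import numpy as np

# Assumed constants (standard RPROP defaults; not given in the abstract).
ETA_PLUS, ETA_MINUS = 1.2, 0.5     # step-size increase/decrease factors
STEP_MIN, STEP_MAX = 1e-6, 50.0    # step-size bounds (assumed)
GAMMA = 0.5                        # assumed blend factor for gbst

def mrprop_step(w, grad, prev_grad, step, history, loss, rng):
    """One MRPROP-style update (illustrative sketch, not the authors' exact rule).

    history: list of (delta_w, loss) pairs from earlier iterations,
             used for the two-candidate 'global best' tournament.
    """
    # Standard RPROP step-size adaptation based on gradient sign agreement.
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0, np.minimum(step * ETA_PLUS, STEP_MAX), step)
    step = np.where(sign_change < 0, np.maximum(step * ETA_MINUS, STEP_MIN), step)
    delta_w = -np.sign(grad) * step          # current weight change

    # gbst selection: pick two recorded changes at random and keep the one
    # whose associated loss was lower (the "better" one in the abstract's terms).
    if len(history) >= 2:
        i, j = rng.choice(len(history), size=2, replace=False)
        gbst = history[i][0] if history[i][1] < history[j][1] else history[j][0]
        # Fold the global best change into the current one (assumed form).
        delta_w = (1.0 - GAMMA) * delta_w + GAMMA * gbst

    history.append((delta_w.copy(), loss))   # remember this change and its loss
    return w + delta_w, step, grad           # grad becomes prev_grad next call

A driver loop would carry step, prev_grad, and history across iterations, e.g. starting from step = np.full_like(w, 0.01), prev_grad = np.zeros_like(w), history = [], and rng = np.random.default_rng(0); the per-weight arrays and the blended update form are illustrative choices under the stated assumptions, not claims about the paper.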

Keyword:

Backpropagation; Iterative methods; Global optimization; Neural networks; Classification (of information); Genetic algorithms

Author Community:

  • [ 1 ] [Ur Rehman, Sadaqat]Department of Electronic Engineering, Tsinghua University, Beijing, China
  • [ 2 ] [Tu, Shanshan]Faculty of IT, Beijing University of Technology, Beijing, China
  • [ 3 ] [Huang, Yongfeng]Department of Electronic Engineering, Tsinghua University, Beijing, China

Reprint Author's Address:

  • [Ur Rehman, Sadaqat]Department of Electronic Engineering, Tsinghua University, Beijing, China

Source: Lecture Notes in Computer Science (LNCS)

ISSN: 0302-9743

Year: 2017

Volume: 10614 LNCS

Page: 737-738

Language: English

ESI Highly Cited Papers on the List: 0
