Abstract:
The spatial architecture neural network (SANN), inspired by the connection pattern of excitatory pyramidal neurons and inhibitory interneurons in the neocortex, is a multilayer artificial neural network with good learning accuracy and generalization ability in real applications. However, its backpropagation-based learning algorithm (BP-SANN) can be time-consuming and slow to converge. In this paper, a new fast and accurate two-phase sequential learning scheme for SANN (named SFSL-SANN) is introduced to guarantee the network performance. With this learning approach, only the weights connecting to output neurons are trained during the learning process. In the first phase, a least-squares method is applied to estimate the span-output-weight on the basis of fixed, randomly generated initial weight values. An improved iterative learning algorithm is then used to learn the feedforward-output-weight in the second phase. A detailed effectiveness comparison of SFSL-SANN with BP-SANN and other popular neural network approaches is carried out on benchmark problems drawn from classification, regression and time-series prediction applications. The results demonstrate that SFSL-SANN converges faster and is more time-efficient than BP-SANN, and produces better learning accuracy and generalization performance than the other approaches. © 2014 Elsevier B.V. All rights reserved.
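The abstract only outlines the two-phase idea, so the following is a minimal illustrative sketch, not the paper's method: it assumes a generic single-hidden-layer setup with fixed random input-side weights, uses a closed-form least-squares fit for one set of output weights (standing in for the span-output-weight of phase one), and a simple iterative gradient update on the residual for a second set (standing in for the feedforward-output-weight of phase two). All names and architectural details here are assumptions.

```python
# Sketch of a two-phase sequential output-weight learning scheme, loosely
# following the abstract's description of SFSL-SANN. The hidden mapping, the
# split into "span" and "feedforward" output weights, and the update rule are
# simplifying assumptions, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data.
X = rng.standard_normal((200, 5))
y = np.sin(X[:, :1]) + 0.1 * rng.standard_normal((200, 1))

# Fixed, randomly generated input-side weights (never trained, as in the abstract).
W_in = rng.standard_normal((5, 30))
b_in = rng.standard_normal(30)
H = np.tanh(X @ W_in + b_in)  # hidden representation from the fixed weights

# Phase 1: estimate the span-output-weight by least squares on the fixed
# random hidden features (closed form, no iteration).
W_span, *_ = np.linalg.lstsq(H, y, rcond=None)
residual = y - H @ W_span

# Phase 2: learn an assumed second set of output weights (here acting on the
# raw inputs) with a simple iterative gradient update on the residual error.
W_ff = np.zeros((5, 1))
lr = 0.01
for _ in range(500):
    err = residual - X @ W_ff
    W_ff += lr * X.T @ err / len(X)

y_hat = H @ W_span + X @ W_ff
print("train MSE:", float(np.mean((y - y_hat) ** 2)))
```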
Source:
Applied Soft Computing Journal
ISSN: 1568-4946
Year: 2014
Volume: 25
Page: 129-138
8.700 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold:188
JCR Journal Grade:1
CAS Journal Grade:2
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 5
ESI Highly Cited Papers on the List: 0