Abstract:
Because the conventional radial basis function (RBF) neural network cannot change its structure on-line, a new dynamic-structure RBF (D-RBF) neural network is designed in this paper. D-RBF uses a sensitivity analysis (SA) method to evaluate the contribution of each hidden node's output to the network output, so that hidden nodes can be inserted into or pruned from the RBF neural network. The final structure of D-RBF is neither too large nor too small for the given objectives, and the convergence of the dynamic process is investigated. The gradient-descent method used for parameter adjustment ensures the convergence of the D-RBF neural network: the network structure is self-organizing and its parameters are self-adaptive. Finally, D-RBF is applied to non-linear function approximation and non-linear system modelling. The results show that the proposed D-RBF achieves favorable self-adaptation and approximation ability. In particular, comparisons with the minimal resource allocation network (MRAN) and the generalized growing and pruning RBF (GGAP-RBF) show that the proposed algorithm is more effective in terms of generalization and final network structure. Copyright © 2010 Acta Automatica Sinica. All rights reserved.
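To make the growing/pruning idea concrete, the following is a minimal Python sketch, assuming a Gaussian RBF with a shared width, error-triggered node insertion in the style of MRAN, an exponentially averaged |w_j * phi_j| contribution as a crude stand-in for the paper's sensitivity analysis, and plain gradient descent on the output weights. The class name DynamicRBF, the thresholds, and the learning rate are illustrative assumptions, not the authors' published D-RBF algorithm.

import numpy as np

class DynamicRBF:
    def __init__(self, sigma=0.5, lr=0.05, grow_err=0.2, prune_sens=1e-3):
        self.centers = np.empty((0, 1))   # hidden-node centers (scalar inputs for simplicity)
        self.weights = np.empty(0)        # output-layer weights
        self.sens = np.empty(0)           # running contribution estimate per hidden node
        self.sigma = sigma                # shared Gaussian width (assumed fixed)
        self.lr = lr                      # gradient-descent learning rate
        self.grow_err = grow_err          # error threshold that triggers node insertion
        self.prune_sens = prune_sens      # contribution threshold below which a node is pruned

    def _phi(self, x):
        # Gaussian activations of all hidden nodes for scalar input x
        if self.centers.shape[0] == 0:
            return np.empty(0)
        d = self.centers[:, 0] - x
        return np.exp(-(d ** 2) / (2.0 * self.sigma ** 2))

    def predict(self, x):
        phi = self._phi(x)
        return float(phi @ self.weights) if phi.size else 0.0

    def train_step(self, x, y):
        phi = self._phi(x)
        err = y - (float(phi @ self.weights) if phi.size else 0.0)

        # Grow: insert a node centered at x when the instantaneous error is large
        if abs(err) > self.grow_err:
            self.centers = np.vstack([self.centers, [[x]]])
            self.weights = np.append(self.weights, err)
            self.sens = np.append(self.sens, 1.0)  # seed the new node's contribution estimate
            return

        # Gradient descent on the output weights (squared-error loss)
        self.weights += self.lr * err * phi

        # Prune: track an exponential average of |w_j * phi_j| and drop negligible nodes
        self.sens = 0.95 * self.sens + 0.05 * np.abs(self.weights * phi)
        keep = self.sens > self.prune_sens
        if keep.size and not keep.all():
            self.centers = self.centers[keep]
            self.weights = self.weights[keep]
            self.sens = self.sens[keep]

# Usage: approximate y = sin(x) online and let the hidden layer adapt its own size.
rng = np.random.default_rng(0)
net = DynamicRBF()
for _ in range(2000):
    x = rng.uniform(-np.pi, np.pi)
    net.train_step(x, np.sin(x))
print("hidden nodes:", net.centers.shape[0])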
Source: Acta Automatica Sinica
ISSN: 0254-4156
Year: 2010
Issue: 6
Volume: 36
Pages: 865-872
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 69
ESI Highly Cited Papers on the List: 0