Author:

Li, Wenjing | Chu, Minghui | Qiao, Junfei

Indexed by:

EI; Scopus; SCIE; PubMed

Abstract:

By approximating the topology of biological neural networks, small-world neural networks have been shown to improve the generalization performance of artificial neural networks. However, the architecture of small-world neural networks is typically large and predefined, which can cause overfitting and long training times, and prevents an optimal network structure from being obtained automatically for a given problem. To address these problems, this paper proposes a pruning feedforward small-world neural network (PFSWNN) and applies it to nonlinear system modeling. First, a feedforward small-world neural network (FSWNN) is constructed according to the Watts-Strogatz rewiring rule. Second, the importance of each hidden neuron is evaluated by its Katz centrality. If the Katz centrality of a hidden neuron falls below a predefined threshold, the neuron is regarded as unimportant and is merged with its most correlated neuron in the same hidden layer. The connection weights are trained with a gradient-based algorithm, and the convergence of the proposed PFSWNN is analyzed theoretically. Finally, the PFSWNN model is tested on several nonlinear system modeling problems, including the approximation of a rapidly changing function, CATS missing time-series prediction, four benchmark problems from public UCI datasets, and a practical wastewater treatment process problem. Experimental results demonstrate that PFSWNN achieves superior generalization performance owing to its small-world property and the pruning algorithm, and that its training time is shortened by the resulting compact structure. (C) 2020 Elsevier Ltd. All rights reserved.
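
The two mechanisms the abstract relies on, Katz centrality as an importance score and merging a low-centrality neuron into its most correlated neighbour, can be sketched in a few lines. The Python sketch below is not the authors' code: the fixed-point iteration, the incoming-weight correlation proxy, the toy topology, and all parameter values (alpha, beta, the median threshold) are illustrative assumptions.

import numpy as np

def katz_centrality(adj, alpha=0.05, beta=1.0, tol=1e-8, max_iter=1000):
    # Fixed-point iteration for x = alpha * adj.T @ x + beta; converges
    # whenever alpha < 1 / (largest eigenvalue of adj). For a feedforward
    # (acyclic) network, adj is nilpotent, so any alpha works.
    x = np.zeros(adj.shape[0])
    for _ in range(max_iter):
        x_new = alpha * adj.T @ x + beta
        if np.abs(x_new - x).sum() < tol:
            break
        x = x_new
    return x_new

def merge_low_centrality_neurons(w_in, w_out, centrality, threshold):
    # w_in:  (n_hidden, n_inputs)  incoming weights of one hidden layer
    # w_out: (n_outputs, n_hidden) outgoing weights of the same layer
    # Each neuron below the threshold is folded into the surviving neuron
    # whose incoming-weight vector is most correlated with its own
    # (a stand-in for the correlation measure used in the paper).
    keep = centrality >= threshold
    if not keep.any():                  # never prune an entire layer
        return w_in, w_out
    corr = np.corrcoef(w_in)            # row-wise neuron correlations
    survivors = np.flatnonzero(keep)
    for i in np.flatnonzero(~keep):
        j = survivors[np.argmax(corr[i, survivors])]
        # approximately preserve the pruned neuron's contribution
        # (exact only when the two activations coincide)
        w_out[:, j] += w_out[:, i]
    return w_in[keep], w_out[:, keep]

# Toy demo: 4 inputs, 8 hidden, 2 outputs; some input->hidden edges are
# dropped at random so hidden neurons end up with unequal centrality
# (the paper instead builds the topology by Watts-Strogatz rewiring).
rng = np.random.default_rng(0)
n_in, n_h, n_out = 4, 8, 2
adj = np.zeros((n_in + n_h + n_out,) * 2)
adj[:n_in, n_in:n_in + n_h] = rng.random((n_in, n_h)) > 0.3
adj[n_in:n_in + n_h, n_in + n_h:] = 1
c = katz_centrality(adj)[n_in:n_in + n_h]        # hidden-node centrality
w_in = rng.normal(size=(n_h, n_in))
w_out = rng.normal(size=(n_out, n_h))
w_in, w_out = merge_low_centrality_neurons(w_in, w_out, c, np.median(c))
print(w_in.shape, w_out.shape)                   # layer shrinks after pruning

Folding the pruned neuron's outgoing weights into its most correlated survivor approximately preserves the layer's output, which is presumably why the method merges neurons rather than simply deleting them.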

Keyword:

Nonlinear system modeling; Pruning algorithm; Katz centrality; Small-world neural network

Author Community:

  • [ 1 ] Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] Beijing Key Lab Computat Intelligence & Intelligent Syst, Beijing, Peoples R China
  • [ 3 ] Beijing Adv Innovat Ctr Future Internet Technol, Beijing 100124, Peoples R China

Reprint Author's Address:

  • [Li, Wenjing] 100 Pingleyuan, Beijing 100124, Peoples R China

Source:

NEURAL NETWORKS

ISSN: 0893-6080

Year: 2020

Volume: 130

Page: 269-285

Impact Factor: 7.800 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 132

Cited Count:

WoS CC Cited Count: 18

SCOPUS Cited Count: 16

ESI Highly Cited Papers on the List: 0

