
Author:

Qiao, Junfei | Li, Sanyi | Han, Honggui | Wang, Dianhui

Indexed by:

EI; Scopus; SCIE

Abstract:

Feedforward neural networks (FNNs) with a single hidden layer have been widely applied in data modeling due to their universal approximation capability for nonlinear maps. However, this theoretical result does not provide any guideline for determining the architecture of the model in practice. Thus, research on the self-organization of FNNs is useful and critical for effective data modeling. This paper proposes a hybrid constructing and pruning strategy (HCPS) for this problem, where mutual information (MI) and sensitivity analysis (SA) are employed to measure the amount of internal information shared between hidden-layer neurons and the contribution rate of each hidden neuron, respectively. HCPS merges hidden neurons when their MI value becomes too high, deletes hidden neurons when their contribution rates are sufficiently small, and splits hidden neurons when their contribution rates are very large. For each input pattern fed into the model as a training sample, the weights of the neural network are updated so that the model's output remains unchanged during structural adjustment. HCPS aims to obtain a condensed model by eliminating redundant neurons without degrading the instantaneous modeling performance, which is associated with the model's generalization property. The proposed algorithm is evaluated on several benchmark data sets, including classification problems, a nonlinear system identification problem, a time-series prediction problem, and a real-world application to PM2.5 prediction. Simulation results with comparisons demonstrate that the proposed method performs favorably and improves on existing work in terms of modeling performance. (C) 2017 Elsevier B.V. All rights reserved.
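
The abstract describes a concrete decision rule: merge hidden neurons whose pairwise MI is too high, delete neurons whose SA contribution rate is too small, and split neurons whose contribution rate is too large. As a rough illustration only, the Python sketch below implements that selection step; the histogram-based MI estimator, the contribution vector, and the thresholds mi_high, sa_low, and sa_high are hypothetical placeholders, not the paper's actual estimators or values.

import numpy as np

def pairwise_mi(H, bins=16):
    """Estimate MI between each pair of hidden-neuron outputs H[:, i],
    H[:, j] with a 2-D histogram (one of many possible estimators)."""
    n = H.shape[1]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            joint, _, _ = np.histogram2d(H[:, i], H[:, j], bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of neuron i
            py = pxy.sum(axis=0, keepdims=True)   # marginal of neuron j
            nz = pxy > 0                          # avoid log(0)
            mi[i, j] = mi[j, i] = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi

def adjust_structure(H, contribution, mi_high=0.9, sa_low=0.01, sa_high=0.5):
    """Select candidate neurons per the HCPS idea: merge redundant pairs
    (high MI), delete low-contribution neurons, split overloaded ones.
    Thresholds here are illustrative, not from the paper."""
    mi = pairwise_mi(H)
    n = mi.shape[0]
    merge = [(i, j) for i in range(n) for j in range(i + 1, n) if mi[i, j] > mi_high]
    delete = [k for k, c in enumerate(contribution) if c < sa_low]
    split = [k for k, c in enumerate(contribution) if c > sa_high]
    return merge, delete, split

# Toy usage with random hidden-layer outputs and stand-in SA values:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
H = np.tanh(X @ rng.normal(size=(4, 6)))   # hidden-layer activations
contribution = rng.random(6)               # stand-in contribution rates
merge, delete, split = adjust_structure(H, contribution)

Note that in the paper's scheme each merge or deletion is accompanied by a compensating update of the weights so that the model's output is unchanged by the structural adjustment; this sketch only selects the candidate neurons.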

Keyword:

Self-organization structure; Feedforward neural network; Sensitivity analysis; Mutual information

Author Community:

  • [ 1 ] [Qiao, Junfei]Beijing Univ Technol, Coll Elect Informat & Control Engn, Beijing 100124, Peoples R China
  • [ 2 ] [Li, Sanyi]Beijing Univ Technol, Coll Elect Informat & Control Engn, Beijing 100124, Peoples R China
  • [ 3 ] [Han, Honggui]Beijing Univ Technol, Coll Elect Informat & Control Engn, Beijing 100124, Peoples R China
  • [ 4 ] [Wang, Dianhui]Beijing Univ Technol, Coll Elect Informat & Control Engn, Beijing 100124, Peoples R China
  • [ 5 ] [Qiao, Junfei]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 6 ] [Li, Sanyi]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 7 ] [Han, Honggui]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 8 ] [Wang, Dianhui]La Trobe Univ, Dept Comp Sci & Informat Technol, Melbourne, Vic 3083, Australia

Reprint Author's Address:

  • [Wang, Dianhui]La Trobe Univ, Dept Comp Sci & Informat Technol, Melbourne, Vic 3083, Australia

Source:

NEUROCOMPUTING

ISSN: 0925-2312

Year: 2017

Volume: 262

Page: 28-40

Impact Factor: 6.000 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE;

ESI HC Threshold: 175

CAS Journal Grade: 2

Cited Count:

WoS CC Cited Count: 24

SCOPUS Cited Count: 29

ESI Highly Cited Papers on the List: 0

