Indexed by:
Abstract:
As an extension of traditional echo state networks (ESNs), polynomial echo state networks (PESNs) were proposed in our previous work (Yang et al., 2018) by employing a polynomial function of the complete input variables as the output weight matrix. In practice, the generalization performance and computational burden of PESNs are degraded by redundant or irrelevant inputs. To construct output weights from a suitable subset of input variables, the forward selection based PESN (FS-PESN) and the backward selection based PESN (BS-PESN) are proposed. First, the forward selection method is used in FS-PESN to choose the input variable that yields the maximum reduction of the objective function, while the backward selection scheme is introduced in BS-PESN to remove the input variable that leads to the smallest increase of the objective function. Then, iterative updating strategies are designed to avoid repetitive computations in FS-PESN and BS-PESN. In particular, an accelerating scheme is introduced into BS-PESN to simplify the training process. Finally, numerical simulations are carried out to illustrate the effectiveness of the proposed techniques in terms of generalization ability and testing time. (C) 2020 Elsevier B.V. All rights reserved.
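The greedy input-variable selection described in the abstract can be sketched generically. The following is a minimal illustration of forward selection only, scoring each candidate variable by the reduction it brings to a least-squares objective; it is an assumed toy setup (no reservoir, no iterative update formulas, no acceleration scheme), not the paper's FS-PESN algorithm, and the function name and SSE objective are illustrative choices.

```python
# Hedged sketch: greedy forward selection of input variables by the
# reduction in a least-squares objective. Illustrative only; this is
# not the FS-PESN training procedure from the paper.
import numpy as np

def forward_select(X, y, k):
    """Greedily pick k columns of X whose inclusion most reduces the
    sum of squared errors (SSE) of a linear least-squares fit to y."""
    n, d = X.shape
    selected = []
    remaining = list(range(d))
    for _ in range(k):
        best_j, best_sse = None, np.inf
        for j in remaining:
            cols = selected + [j]
            A = X[:, cols]
            # Least-squares output weights for the candidate subset
            w, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(np.sum((A @ w - y) ** 2))
            # Maximum reduction of the objective == minimum resulting SSE
            if sse < best_sse:
                best_j, best_sse = j, sse
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy data: only columns 1 and 4 actually drive the target
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = 2.0 * X[:, 1] - 3.0 * X[:, 4]
print(sorted(forward_select(X, y, 2)))  # → [1, 4]
```

Backward selection would run the loop in reverse: start from all variables and repeatedly drop the one whose removal increases the SSE least. The paper's iterative updating strategies avoid recomputing the full least-squares solution at every step, which this naive sketch does not.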
Keyword:
Reprint Author's Address:
Email:
Source :
NEUROCOMPUTING
ISSN: 0925-2312
Year: 2020
Volume: 398
Page: 83-94
Impact Factor: 6.000 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold:132
Cited Count:
WoS CC Cited Count: 5
SCOPUS Cited Count: 6
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
30 Days PV: 4
Affiliated Colleges: