Abstract:
The interval type-2 fuzzy neural network (IT2FNN) is widely used to model nonlinear systems. Unfortunately, gradient descent-based IT2FNNs with uncertain variances often suffer from slow convergence due to their inherent singularity. To cope with this problem, a nonsingular gradient descent algorithm (NSGDA) is developed in this article to update the IT2FNN. First, the widths of the type-2 fuzzy rules are transformed into root inverse variances (RIVs), which always satisfy the sufficient condition of differentiability. Second, singular RIVs are reformulated using nonsingular Shapley-based matrices associated with the type-2 fuzzy rules. This averts the convergence stagnation caused by the zero derivatives of singular RIVs, thereby sustaining gradient convergence. Third, an integrated-form update strategy (IUS) is designed to obtain the derivatives of the parameters of the IT2FNN, including the RIVs, centers, weight coefficients, deviations, and proportionality coefficient. These parameters are packed into multiple subvariable matrices, which can accelerate gradient convergence through parallel calculation instead of sequential iteration. Finally, experiments show that the proposed NSGDA-based IT2FNN achieves improved convergence speed through the improved learning algorithm.
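The stagnation problem the abstract describes can be illustrated with a minimal sketch, assuming a Gaussian membership function for a fuzzy rule and taking the root inverse variance (RIV) to be simply the reciprocal width, lam = 1/sigma. All names here are hypothetical; this does not reproduce the paper's Shapley-based reformulation, only the zero-derivative issue it is designed to avert.

```python
import math

def membership_riv(x, c, lam):
    """Gaussian membership parameterized by the RIV lam = 1/sigma (assumed form).

    mu(x) = exp(-0.5 * (lam * (x - c))**2), so the function stays
    differentiable in lam everywhere, including lam = 0.
    """
    return math.exp(-0.5 * (lam * (x - c)) ** 2)

def grad_riv(x, c, lam):
    """Derivative of the membership with respect to the RIV lam.

    Note the factor lam: at lam = 0 (a singular RIV) the derivative is
    exactly zero for every input x, so plain gradient descent cannot move
    lam away from zero -- the convergence stagnation the abstract refers to.
    """
    return -membership_riv(x, c, lam) * lam * (x - c) ** 2
```

For any input, `grad_riv(x, c, 0.0)` returns 0.0, so an update rule driven only by this gradient leaves a singular RIV stuck at zero; the paper's nonsingular Shapley-based matrices are introduced precisely to keep such updates from stalling.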
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Year: 2022
Issue: 6
Volume: 35
Page: 8176-8189
Impact Factor: 10.400 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 46
JCR Journal Grade:1
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 2
SCOPUS Cited Count: 4
ESI Highly Cited Papers on the List: 0