Abstract:
Deep neural network techniques have recently been recognized as powerful tools for solving complex and challenging modeling problems of microwave components. However, directly training a fully connected deep neural network with sigmoid activation functions using the backpropagation (BP) algorithm is difficult because of the vanishing gradient problem. In this paper, we propose a novel deep neural network modeling technique with batch normalization (BN) to address this problem. BN layers are added before every sigmoid hidden layer of the deep neural network to normalize the inputs of each sigmoid hidden layer, with additional scaling and shifting, thereby overcoming the vanishing gradient problem. An automated model generation (AMG) algorithm is also utilized to automatically determine a suitable number of BN layers and sigmoid hidden layers during the deep neural network training process. The proposed technique is illustrated by two microwave examples.
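The abstract's structural idea, placing a batch-normalization layer ahead of each sigmoid hidden layer of a fully connected network, can be sketched as follows. This is a minimal illustrative reconstruction in PyTorch, not the authors' code; the layer widths, depth, input/output dimensions, and the helper name build_bn_sigmoid_mlp are assumptions for demonstration only.

```python
# Minimal sketch (assumptions, not the paper's implementation): a fully connected
# network where a BatchNorm1d layer precedes every sigmoid hidden layer.
import torch
import torch.nn as nn

def build_bn_sigmoid_mlp(in_dim, hidden_dims, out_dim):
    """Stack [Linear -> BatchNorm1d -> Sigmoid] blocks, then a linear output layer."""
    layers = []
    prev = in_dim
    for width in hidden_dims:
        layers.append(nn.Linear(prev, width))
        layers.append(nn.BatchNorm1d(width))  # normalize inputs, then apply learnable scale/shift
        layers.append(nn.Sigmoid())           # sigmoid now sees normalized inputs, easing vanishing gradients
        prev = width
    layers.append(nn.Linear(prev, out_dim))   # linear output layer for regression targets
    return nn.Sequential(*layers)

if __name__ == "__main__":
    # Hypothetical dimensions: 2 inputs, 4 sigmoid hidden layers, 1 output.
    model = build_bn_sigmoid_mlp(in_dim=2, hidden_dims=[20, 20, 20, 20], out_dim=1)
    x = torch.randn(64, 2)   # a batch of input samples
    y = model(x)             # forward pass; BN uses batch statistics in training mode
    print(y.shape)           # torch.Size([64, 1])
```

In the paper's AMG procedure, the number of such BN/sigmoid hidden layers would be determined automatically during training rather than fixed in advance as it is in this sketch.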
Source:
2020 IEEE MTT-S INTERNATIONAL CONFERENCE ON NUMERICAL ELECTROMAGNETIC AND MULTIPHYSICS MODELING AND OPTIMIZATION (NEMO 2020)
Year: 2020
Language: English
Cited Count:
WoS CC Cited Count: 2
SCOPUS Cited Count: 5
ESI Highly Cited Papers on the List: 0