Abstract:
Deep neural networks (DNNs) with skip connections differ from the standard feed-forward architecture in that skip connections are added between layers. Adding skip connections to all layers of the network alleviates the vanishing-gradient problem, which benefits the training of deep networks and yields faster convergence. In addition, the skip connections pass more speech-signal details through the network, helping it recover the speech signal more accurately. In this paper, the ideal Wiener filter is first chosen as the training target of the DNN with skip connections (Skip-DNN), with the cepstral features of the noisy speech signal as input. We then investigate the performance of enhanced speech obtained by combining DNN-based phase estimation in the complex domain with the clean-speech magnitude estimated using the ideal Wiener filter and the Skip-DNN. Experiments are conducted on the TIMIT corpus with 102 noise types at four signal-to-noise ratio (SNR) levels. The results show that the proposed methods achieve higher speech quality and intelligibility than the reference approaches. © 2018 IEEE.
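For concreteness, the following is a minimal NumPy/SciPy sketch of the ideal Wiener filter training target described in the abstract, W = |S|^2 / (|S|^2 + |N|^2), together with mask-based enhancement that reuses the noisy phase. It is an illustrative sketch under assumed conventions (time-aligned clean/noise signals, 16 kHz sampling, 512-point STFT); the function names and parameters are illustrative and not taken from the paper.

    import numpy as np
    from scipy.signal import stft, istft

    def ideal_wiener_mask(clean, noise, fs=16000, nperseg=512):
        """Ideal Wiener filter target W = |S|^2 / (|S|^2 + |N|^2),
        computed from the STFTs of the clean and noise signals.
        Assumes clean and noise are time-aligned and equally long."""
        _, _, S = stft(clean, fs=fs, nperseg=nperseg)  # clean speech STFT
        _, _, N = stft(noise, fs=fs, nperseg=nperseg)  # noise STFT
        return np.abs(S) ** 2 / (np.abs(S) ** 2 + np.abs(N) ** 2 + 1e-12)

    def enhance(noisy, mask, fs=16000, nperseg=512):
        """Apply a (here: oracle) mask to the noisy STFT magnitude,
        keep the noisy phase, and invert back to the time domain.
        In the paper's setup the mask would instead be predicted by
        the Skip-DNN from cepstral features of the noisy speech."""
        _, _, Y = stft(noisy, fs=fs, nperseg=nperseg)
        _, x_hat = istft(mask * np.abs(Y) * np.exp(1j * np.angle(Y)),
                         fs=fs, nperseg=nperseg)
        return x_hat

In training, the mask above would serve as the regression target for the Skip-DNN; at test time the estimated mask replaces the oracle one, and the paper's second variant additionally replaces the noisy phase with a DNN-based phase estimate in the complex domain.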
Year: 2018
Volume: 2018-August
Page: 270-275
Language: English
WoS CC Cited Count: 0
SCOPUS Cited Count: 4
ESI Highly Cited Papers on the List: 0