
Author:

Wang, Dujuan (Wang, Dujuan.) | Bao, Changchun (Bao, Changchun.) (Scholars:鲍长春)

Indexed by:

EI; Scopus

Abstract:

A deep neural network (DNN) with skip connections differs from the standard feed-forward architecture in that direct connections are added between layers. Adding skip connections to all layers alleviates the vanishing-gradient problem, which eases the training of deep networks and speeds up convergence. In addition, the skip connections pass more details of the speech signal to later layers, helping the network recover the speech signal more accurately. In this paper, the ideal Wiener filter is first chosen as the training target of the DNN with skip connections (Skip-DNN), with the cepstral features of the noisy speech signal as its input. We then investigate the performance of enhanced speech obtained by combining DNN-based phase estimation in the complex domain with the clean speech magnitude estimated via the ideal Wiener filter and the Skip-DNN. Experiments are conducted on the TIMIT corpus with 102 noise types at four signal-to-noise ratio (SNR) levels. The results show that the proposed methods achieve higher speech quality and intelligibility than the reference approaches. © 2018 IEEE.
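The skip-connection idea in the abstract can be sketched as a small fully connected network in which every hidden layer also receives the input features directly. This is a minimal NumPy illustration, not the authors' implementation: the layer sizes, the concatenation-style skip, the sigmoid output (to keep the estimated Wiener mask in [0, 1]), and the feature/bin dimensions are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class SkipDNN:
    """Fully connected net where each hidden layer is concatenated with a
    skip connection from the input features (illustrative sketch)."""

    def __init__(self, in_dim, hidden_dim, out_dim, n_hidden=3):
        self.Ws, self.bs = [], []
        # First layer sees the raw input; later layers see hidden + skip.
        dims_in = [in_dim] + [hidden_dim + in_dim] * (n_hidden - 1)
        for d in dims_in:
            self.Ws.append(rng.standard_normal((d, hidden_dim)) * 0.1)
            self.bs.append(np.zeros(hidden_dim))
        self.W_out = rng.standard_normal((hidden_dim + in_dim, out_dim)) * 0.1
        self.b_out = np.zeros(out_dim)

    def forward(self, x):
        h = x
        for W, b in zip(self.Ws, self.bs):
            h = relu(h @ W + b)
            # Skip connection: re-inject the input at every hidden layer.
            h = np.concatenate([h, x], axis=-1)
        # Sigmoid bounds the estimated Wiener mask to [0, 1].
        z = h @ self.W_out + self.b_out
        return 1.0 / (1.0 + np.exp(-z))

# Hypothetical dimensions: 40 cepstral features in, 129 frequency bins out.
net = SkipDNN(in_dim=40, hidden_dim=64, out_dim=129)
noisy_cepstra = rng.standard_normal((8, 40))   # batch of noisy-speech features
mask = net.forward(noisy_cepstra)              # estimated Wiener mask per bin

# Enhancement step from the abstract: masked noisy magnitude recombined
# with an estimated phase (random placeholders stand in for real estimates).
noisy_mag = np.abs(rng.standard_normal((8, 129)))
phase_hat = rng.uniform(-np.pi, np.pi, (8, 129))
enhanced_spectrum = mask * noisy_mag * np.exp(1j * phase_hat)
```

In the paper the mask target is the ideal Wiener filter and the phase comes from a DNN-based estimator in the complex domain; here both are placeholders to show only the data flow.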

Keyword:

Multilayer neural networks; Signal processing; Network architecture; Feedforward neural networks; Speech communication; Speech intelligibility; Network layers; Deep neural networks; Signal to noise ratio; Speech enhancement

Author Community:

  • [ 1 ] [Wang, Dujuan] Speech and Audio Signal Processing Lab, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Bao, Changchun] Speech and Audio Signal Processing Lab, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source:

Year: 2018

Volume: 2018-August

Page: 270-275

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 4

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 15
