
Author:

Xiang, Yang | Bao, Changchun (鲍长春)

Indexed by:

EI Scopus

Abstract:

In this paper, we present three strategies for speech enhancement based on cepstral mapping and deep neural networks (DNNs). Firstly, we apply a DNN to directly predict clean-speech cepstral features from noisy cepstral input; the desired clean speech is then obtained by waveform reconstruction. Compared with directly mapping log-power spectra (LPS), our method recovers the speech harmonic structure more effectively and achieves higher speech quality. Additionally, we also utilize a DNN to estimate an ideal Wiener filter from the noisy cepstral input. Finally, a fusion framework that combines cepstral feature mapping and Wiener filtering is proposed to acquire the enhanced speech signal. Experiments show that the proposed algorithms achieve state-of-the-art performance in improving the quality and intelligibility of noisy speech. © 2018 IEEE.
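The cepstral-mapping pipeline described in the abstract can be sketched as follows. This is a minimal numpy-only illustration, not the authors' model: the feed-forward network here is randomly initialized (standing in for a trained mapping DNN), and the layer sizes, FFT size, and window are illustrative assumptions.

```python
import numpy as np

def real_cepstrum(frame, n_fft=256):
    """Noisy input feature: real cepstrum of one windowed frame."""
    spec = np.fft.rfft(frame * np.hanning(len(frame)), n_fft)
    log_mag = np.log(np.abs(spec) + 1e-8)   # log-magnitude spectrum
    return np.fft.irfft(log_mag, n_fft)     # inverse FFT -> real cepstrum

def dnn_forward(x, weights, biases):
    """Toy feed-forward DNN: ReLU hidden layers, linear output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, W @ h + b)
    return weights[-1] @ h + biases[-1]

def cepstrum_to_magnitude(cep, n_fft=256):
    """Invert a predicted cepstrum back to a magnitude spectrum.
    (For waveform reconstruction the noisy phase would be reused.)"""
    log_mag = np.fft.rfft(cep, n_fft).real
    return np.exp(log_mag)

rng = np.random.default_rng(0)
n_fft = 256
dims = [n_fft, 512, 512, n_fft]   # assumed layer sizes, for illustration only
weights = [rng.standard_normal((o, i)) * 0.01 for i, o in zip(dims, dims[1:])]
biases = [np.zeros(o) for o in dims[1:]]

noisy_frame = rng.standard_normal(n_fft)        # stand-in for a noisy speech frame
cep_noisy = real_cepstrum(noisy_frame, n_fft)   # noisy cepstral input
cep_clean_hat = dnn_forward(cep_noisy, weights, biases)  # predicted clean cepstrum
mag_hat = cepstrum_to_magnitude(cep_clean_hat, n_fft)    # spectrum for reconstruction
```

In the paper's second strategy, the same kind of network would instead output Wiener-filter gains to be applied to the noisy spectrum, and the fusion framework combines both outputs.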

Keyword:

Neural networks; Photomapping; Speech intelligibility; Speech enhancement; Deep neural networks

Author Community:

  • [ 1 ] [Xiang, Yang]Faculty of Information Technology, Beijing University of Technology, Speech and Audio Signal Processing Lab, Beijing, China
  • [ 2 ] [Bao, Changchun]Faculty of Information Technology, Beijing University of Technology, Speech and Audio Signal Processing Lab, Beijing, China



Year: 2018

Page: 1263-1267

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

