Author:

Wang, Dujuan | Bao, Changchun (Scholars: 鲍长春)

Indexed by:

EI

Abstract:

The purpose of speech enhancement is to extract the useful speech signal from noisy speech. The performance of speech enhancement has improved greatly in recent years with the rapid development of deep learning. However, these studies mainly focus on the frequency domain, which requires a time-frequency transformation and ignores the phase information of speech. Therefore, end-to-end (i.e., waveform-in, waveform-out) speech enhancement was investigated, which not only avoids a fixed time-frequency transformation but also allows the phase information to be modelled. In this paper, a fully convolutional network with skip connections (SC-FCN) for end-to-end speech enhancement is proposed. Without fully connected layers, this network can effectively characterize the local information of the speech signal and better restore the high-frequency components of the waveform with fewer parameters. Meanwhile, because of the skip connections between different layers, deep networks become easier to train and the problem of vanishing gradients can be tackled. In addition, these skip connections carry more details of the speech signal across the convolutional layers, which is beneficial for recovering the original speech signal. According to our experimental results, the proposed method recovers the waveform better. © 2019 IEEE.
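The abstract describes the SC-FCN only at a high level, so the following is a minimal, hypothetical PyTorch sketch of a fully convolutional waveform-in/waveform-out network with additive skip connections. The layer count, channel width, kernel size, and choice of nonlinearity are illustrative assumptions, not the paper's actual configuration.

# Minimal sketch of a fully convolutional network with skip
# connections (SC-FCN-style) for end-to-end speech enhancement.
# All hyperparameters below are illustrative assumptions; the
# record gives no implementation details.
import torch
import torch.nn as nn

class SCFCN(nn.Module):
    def __init__(self, channels=64, kernel_size=11, num_layers=6):
        super().__init__()
        pad = kernel_size // 2  # same-length output at stride 1
        self.input_conv = nn.Conv1d(1, channels, kernel_size, padding=pad)
        self.hidden = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size, padding=pad)
            for _ in range(num_layers)
        )
        self.output_conv = nn.Conv1d(channels, 1, kernel_size, padding=pad)
        self.act = nn.PReLU()  # hypothetical choice of nonlinearity

    def forward(self, x):
        # x: (batch, 1, samples) raw noisy waveform; no STFT anywhere,
        # so phase is handled implicitly in the time domain.
        h = self.act(self.input_conv(x))
        for conv in self.hidden:
            # Additive skip connection: each layer's input is added back
            # to its output, easing gradient flow in deeper networks and
            # passing fine waveform detail toward the output.
            h = self.act(conv(h)) + h
        return self.output_conv(h)  # enhanced waveform, same length

if __name__ == "__main__":
    model = SCFCN()
    noisy = torch.randn(2, 1, 16000)   # 1 s of 16 kHz audio per item
    enhanced = model(noisy)
    print(enhanced.shape)              # torch.Size([2, 1, 16000])

In this sketch the residual-style additive skips stand in for the skip connections the abstract credits with easier training and detail recovery; the paper may instead use encoder-decoder skips between mirrored layers.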

Keyword:

Signal reconstruction; Convolutional neural networks; Frequency domain analysis; Convolution; Speech communication; Speech enhancement; Deep learning; Network layers

Author Community:

  • [ 1 ] [Wang, Dujuan] Speech and Audio Signal Processing Lab, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Bao, Changchun] Speech and Audio Signal Processing Lab, Beijing University of Technology, Beijing 100124, China

Year: 2019

Page: 890-895

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0

Page Views (last 30 days): 9
