
Author:

Zhao, Yunhao | Bao, Changchun | Yang, Xue

Indexed by:

EI

Abstract:

To enhance the perceptual quality of speech signals, the Packet Loss Concealment (PLC) technique focuses on recovering speech lost due to network latency and jitter. In practical applications, PLC methods typically employ a predictive process that relies on previously received speech to recover the lost speech without introducing additional delay. In this paper, we propose a predictive PLC network that employs the Conformer and a temporal convolution module to fully exploit contextual dependencies and better predict the lost speech. In addition, the proposed network can be directly employed as the generator and combined with appropriate discriminative networks, forming a Generative Adversarial Network (GAN) paradigm that further enhances the perceptual quality of the recovered speech. Experimental results show that, even without any discriminative network, the proposed method achieves impressive PLC performance; under the GAN paradigm, further improvement is observed and the method outperforms several baseline methods at different packet loss rates. © 2023 IEEE.
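The predictive idea behind PLC described above — fill a lost frame using only previously received samples, so no extra delay is introduced — can be illustrated with a deliberately simple sketch. This is a toy two-tap linear extrapolator, not the paper's Conformer/GAN network; the function name and frame layout are illustrative assumptions:

```python
def conceal(frames, frame_len=4):
    """Toy predictive packet loss concealment (PLC).

    frames: list of frames (each a list of samples); a lost packet
    is represented by None. A lost frame is predicted sample by
    sample from the two most recent received/recovered samples,
    using linear waveform extrapolation: x[n] = 2*x[n-1] - x[n-2].
    Only past samples are used, so no additional delay is added.
    """
    history = []   # previously received or recovered samples
    out = []
    for frame in frames:
        if frame is None:                  # packet lost: predict it
            pred = []
            for _ in range(frame_len):
                a, b = history[-2], history[-1]
                pred.append(2 * b - a)     # linear extrapolation
                history.append(pred[-1])   # recovered samples feed later predictions
            out.append(pred)
        else:                              # packet received: pass through
            out.append(frame)
            history.extend(frame)
    return out
```

On a linear ramp this predictor recovers a lost frame exactly; on real speech, a learned predictor such as the proposed network replaces this hand-crafted rule.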

Keyword:

Packet loss; Computer system recovery; Recovery; Speech communication; Generative adversarial networks; Convolution

Author Community:

  • [ 1 ] [Zhao, Yunhao] Beijing University of Technology, Speech and Audio Signal Processing Laboratory, Faculty of Information Technology, Beijing 100124, China
  • [ 2 ] [Bao, Changchun] Beijing University of Technology, Speech and Audio Signal Processing Laboratory, Faculty of Information Technology, Beijing 100124, China
  • [ 3 ] [Yang, Xue] Beijing University of Technology, Speech and Audio Signal Processing Laboratory, Faculty of Information Technology, Beijing 100124, China


Year: 2023

Language: English

Cited Count:

WoS CC Cited Count: 0

ESI Highly Cited Papers on the List: 0
