Abstract:
Intent recognition and slot filling play crucial roles in natural language processing, and this study aims to improve performance on both tasks. Given the close interrelation between the two tasks, this paper takes a feature encoding model as the baseline and augments it with a GRU decoding layer and a TextCNN-based layer that represents local semantic features. The resulting joint neural network extracts features in both the temporal and spatial dimensions and incorporates a keyword attention mechanism, enabling it to capture contextual semantic information more precisely. The model is trained with the PGD adversarial training method to improve its robustness against attacks. Additionally, an asynchronous training strategy allows multiple models to learn and adapt independently, accelerating training and further strengthening the model's ability to capture contextual semantic information. © 2024 SPIE.
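The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch, written for this record rather than taken from the paper, of a joint intent-recognition / slot-filling model with a shared embedding layer, a bidirectional GRU for temporal features, a TextCNN branch for local (spatial) features, a simple additive attention layer standing in for the keyword attention mechanism, and a PGD-style perturbation step on the embeddings. All class names, layer sizes, and hyperparameters are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class JointIntentSlotModel(nn.Module):
    """Illustrative joint model: GRU for temporal features, TextCNN for
    local features, additive attention for sentence-level pooling."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                 num_intents=10, num_slots=20,
                 kernel_sizes=(2, 3, 4), num_filters=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim,
                          batch_first=True, bidirectional=True)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k, padding=k // 2)
            for k in kernel_sizes)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # stand-in for keyword attention
        cnn_dim = num_filters * len(kernel_sizes)
        self.intent_head = nn.Linear(2 * hidden_dim + cnn_dim, num_intents)
        self.slot_head = nn.Linear(2 * hidden_dim, num_slots)

    def forward(self, token_ids, embeddings=None):
        # Optionally accept pre-perturbed embeddings for adversarial training.
        emb = self.embedding(token_ids) if embeddings is None else embeddings
        gru_out, _ = self.gru(emb)                          # (B, T, 2H)
        weights = torch.softmax(self.attn(gru_out), dim=1)  # attention over time
        sent_vec = (weights * gru_out).sum(dim=1)           # (B, 2H)
        cnn_vec = torch.cat(
            [torch.relu(c(emb.transpose(1, 2))).max(dim=2).values
             for c in self.convs], dim=1)                   # (B, F * num_kernels)
        intent_logits = self.intent_head(torch.cat([sent_vec, cnn_vec], dim=1))
        slot_logits = self.slot_head(gru_out)               # per-token slot labels
        return intent_logits, slot_logits

def pgd_perturb(embeddings, loss_fn, steps=3, alpha=0.03, epsilon=0.1):
    """PGD-style adversarial perturbation of (detached) input embeddings:
    repeat a signed-gradient ascent step and project back into an epsilon box."""
    delta = torch.zeros_like(embeddings, requires_grad=True)
    for _ in range(steps):
        loss_fn(embeddings + delta).backward()
        with torch.no_grad():
            delta += alpha * delta.grad.sign()
            delta.clamp_(-epsilon, epsilon)
        delta.grad.zero_()
    return (embeddings + delta).detach()

In a training loop, one would detach the embeddings (model.embedding(token_ids).detach()), perturb them with pgd_perturb using the combined intent and slot loss as loss_fn, and feed the perturbed embeddings back through forward(token_ids, embeddings=...) for the adversarial update; the asynchronous multi-model training strategy mentioned in the abstract is not sketched here.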
ISSN: 0277-786X
Year: 2024
Volume: 13180
Language: English
ESI Highly Cited Papers on the List: 0