Author:

Li, Yang | Yang, Sheng Qi

Indexed by:

EI; Scopus

Abstract:

Due to the rapid development of the Internet, the data generated by people in daily life is growing at an exponential rate, and Short Message Service (SMS) data is one of the social media products of mobile phone users. The main focus of this article is how to distinguish spam messages, extract effective information from them, and identify the information expressed by the messages themselves. In recent years, deep learning has made great breakthroughs in text classification within natural language processing, so this paper conducts further research on spam texts using deep learning methods. This article introduces a model (RCM) that combines prior information extracted by a Long Short-Term Memory (LSTM) network with a Convolutional Neural Network (CNN). First, the temporal and spatial characteristics of a sentence are obtained with a bidirectional LSTM and used as prior information about the message; these features are then merged with the feature information processed by the CNN to improve the accuracy of spam prediction. Compared with previous models, the use of prior information extracted by the LSTM is an innovation, and it also improves performance. © 2018 IEEE.
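The architecture described in the abstract (a bidirectional LSTM producing prior features that are merged with CNN features before classification) can be illustrated with a minimal sketch. The layer sizes, kernel sizes, concatenation-based merge, and all names below are assumptions for illustration, not the paper's exact model.

```python
# Minimal PyTorch sketch of an LSTM+CNN spam classifier in the spirit of the
# abstract's RCM description. All hyperparameters and the concatenation-based
# merge are illustrative assumptions.
import torch
import torch.nn as nn

class LSTMCNNSpamClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, lstm_hidden=64,
                 num_filters=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional LSTM: extracts sequential ("prior") features of the message.
        self.bilstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # CNN branch: 1-D convolutions over the embedded sequence capture local n-gram features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        merged_dim = 2 * lstm_hidden + num_filters * len(kernel_sizes)
        self.classifier = nn.Linear(merged_dim, num_classes)

    def forward(self, token_ids):                  # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)              # (batch, seq_len, embed_dim)
        # LSTM prior: final hidden states of both directions.
        _, (h_n, _) = self.bilstm(x)               # h_n: (2, batch, lstm_hidden)
        lstm_feat = torch.cat([h_n[0], h_n[1]], dim=1)
        # CNN features: convolution + global max pooling per kernel size.
        x_c = x.transpose(1, 2)                    # (batch, embed_dim, seq_len)
        cnn_feat = torch.cat(
            [torch.relu(conv(x_c)).max(dim=2).values for conv in self.convs], dim=1)
        # Merge the two feature sets and classify spam vs. ham.
        return self.classifier(torch.cat([lstm_feat, cnn_feat], dim=1))

# Example: score a batch of two padded SMS messages (spam/ham logits).
model = LSTMCNNSpamClassifier(vocab_size=10000)
logits = model(torch.randint(1, 10000, (2, 40)))   # shape (2, 2)
```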

Keyword:

Text messaging; Cellular telephones; Text processing; Convolutional neural networks; Long short-term memory; E-learning; Deep learning; Mobile telecommunication systems; Natural language processing systems; Classification (of information); Learning systems

Author Community:

  • [ 1 ] [Li, Yang]Department of Information Technology, School of Software Engineering, Beijing University of Technology, Beijing, China
  • [ 2 ] [Yang, Sheng Qi]Department of Information Technology, School of Software Engineering, Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:

Related Keywords:

Related Article:

Source:

Year: 2018

Page: 2327-2331

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

