Author:

Quan, Wei | Zhang, Jinli | Hu, Xiaohua Tony

Indexed by:

EI

Abstract:

Opinion mining has attracted growing interest in both industry and academia over the past decade. Opinion role labeling (ORL) is the task of extracting opinion holders and targets from natural language, answering the question 'who expresses what'. In recent years, neural network based methods with additional lexical and syntactic features have achieved state-of-the-art performance on similar tasks. Moreover, Bidirectional Encoder Representations from Transformers (BERT) has shown impressive performance across a variety of natural language processing (NLP) tasks. To investigate BERT-based end-to-end models for ORL, we propose models that combine BERT, Bidirectional Long Short-Term Memory (BiLSTM) and Conditional Random Field (CRF) layers to jointly extract opinion roles (e.g., opinion holder and target). Experimental results show that our models achieve remarkable scores without using extra syntactic and/or semantic features. To the best of our knowledge, we are among the pioneers to successfully integrate BERT in this manner. Our work contributes to improving state-of-the-art aspect-level opinion mining methods and provides strong baselines for future work. © 2019 IEEE.
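
For illustration, the following is a minimal sketch of the BERT + BiLSTM + CRF tagger described in the abstract, assuming a BIO-style tag set over opinion holders and targets (e.g., B-HOLDER, I-HOLDER, B-TARGET, I-TARGET, O). It uses the Hugging Face transformers BertModel and the third-party pytorch-crf package; the class name, tag set, and hyperparameters are illustrative assumptions, not the authors' published configuration.

import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf


class BertBiLstmCrfTagger(nn.Module):
    """Sketch of a joint opinion-role tagger: BERT encoder -> BiLSTM -> CRF."""

    def __init__(self, num_tags: int, lstm_hidden: int = 256,
                 bert_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.emission = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # Contextual token representations from BERT.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # BiLSTM over BERT outputs, then per-token emission scores.
        lstm_out, _ = self.bilstm(hidden)
        emissions = self.emission(lstm_out)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        # Inference: Viterbi-decoded tag sequence for each sentence.
        return self.crf.decode(emissions, mask=mask)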

Keyword:

Data mining; Mining; Semantics; Random processes; Syntactics; Deep learning; Big data; Sentiment analysis

Author Community:

  • [ 1 ] [Quan, Wei] Drexel University, College of Computing and Informatics, Philadelphia, PA, United States
  • [ 2 ] [Zhang, Jinli] Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 3 ] [Hu, Xiaohua Tony] Drexel University, College of Computing and Informatics, Philadelphia, PA, United States

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source:

Year: 2019

Page: 2438-2446

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 10

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

