Flexibly utilizing syntactic knowledge in aspect-based sentiment analysis

Xiaosai Huang, Jing Li*, Jia Wu, Jun Chang, Donghua Liu, Kai Zhu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)

Abstract

Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity expressed in a text towards a particular aspect. While previous models have utilized dependency graphs and graph neural networks (GNNs) to facilitate information exchange, they face challenges such as over-smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose a new approach called SRE-BERT that flexibly utilizes syntactic knowledge to enhance aspect representations by relying on syntax representations. First, we propose a syntax representation encoder to acquire a syntactic vector for each token. Then, we devise a syntax-guided transformer that employs these syntax representations to compute multi-head attention, thereby enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors produced by the syntax-guided transformer are used to enhance the semantic representations obtained from BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) method to pre-train the syntax representation encoder. Extensive experiments on datasets from three distinct domains show that SRE-BERT outperforms the second-ranked model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
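The core idea in the abstract is attention whose weights are computed from syntax representations rather than from the semantic stream, with the resulting context used to enhance BERT's hidden states. The following is a minimal PyTorch sketch of that idea; the class name `SyntaxGuidedAttention`, the dimensions, and the residual fusion step are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn


class SyntaxGuidedAttention(nn.Module):
    """Multi-head attention whose weights are driven by syntax vectors.

    Queries and keys are projected from per-token syntax representations,
    so any two tokens can interact syntactically without walking a
    dependency graph; values come from the semantic (BERT) stream.
    """

    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)   # from syntax vectors
        self.k_proj = nn.Linear(d_model, d_model)   # from syntax vectors
        self.v_proj = nn.Linear(d_model, d_model)   # from semantic vectors
        self.out = nn.Linear(d_model, d_model)

    def forward(self, syntax: torch.Tensor, semantic: torch.Tensor) -> torch.Tensor:
        # syntax, semantic: (batch, seq_len, d_model)
        b, t, _ = syntax.shape

        def split(x):
            return x.view(b, t, self.n_heads, self.d_head).transpose(1, 2)

        q, k = split(self.q_proj(syntax)), split(self.k_proj(syntax))
        v = split(self.v_proj(semantic))
        # Attention scores depend only on the syntax representations.
        scores = (q @ k.transpose(-2, -1)) / self.d_head ** 0.5
        ctx = (scores.softmax(dim=-1) @ v).transpose(1, 2).reshape(b, t, -1)
        # Residual fusion: syntax-aware context enhances BERT's semantics.
        return semantic + self.out(ctx)


# Usage with random stand-ins for the two streams:
attn = SyntaxGuidedAttention()
syntax = torch.randn(2, 16, 768)    # output of the syntax encoder
semantic = torch.randn(2, 16, 768)  # BERT hidden states
enhanced = attn(syntax, semantic)   # (2, 16, 768)
```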
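The abstract also mentions pre-training the syntax representation encoder with Masked POS Label Prediction (MPLP). Below is a hedged sketch of what such an objective could look like, by analogy with BERT's masked language modelling; the helper names (`MPLPHead`, `syntax_encoder`) and the 15% masking rate are assumptions, not the paper's published recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MPLPHead(nn.Module):
    """Masked POS Label Prediction: mask tokens, recover their POS tags."""

    def __init__(self, syntax_encoder: nn.Module, d_model: int, n_pos_tags: int):
        super().__init__()
        self.encoder = syntax_encoder                 # maps token ids -> vectors
        self.classifier = nn.Linear(d_model, n_pos_tags)

    def forward(self, input_ids, pos_ids, mask_token_id, mask_rate=0.15):
        # Corrupt a random subset of positions, as in BERT-style MLM.
        mask = torch.rand(input_ids.shape, device=input_ids.device) < mask_rate
        corrupted = input_ids.masked_fill(mask, mask_token_id)
        hidden = self.encoder(corrupted)              # (batch, seq, d_model)
        logits = self.classifier(hidden)              # (batch, seq, n_pos_tags)
        # The loss is computed only at masked positions: the encoder must
        # infer each hidden token's part of speech from its context.
        return F.cross_entropy(logits[mask], pos_ids[mask])
```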

Original language: English
Article number: 103630
Pages (from-to): 1-20
Number of pages: 20
Journal: Information Processing and Management
Volume: 61
Issue number: 3
DOIs
Publication status: Published - May 2024

Keywords

  • Aspect-based sentiment analysis
  • BERT
  • Syntax representation
  • Syntax-guided transformer
