TY - JOUR
T1 - Flexibly utilizing syntactic knowledge in aspect-based sentiment analysis
AU - Huang, Xiaosai
AU - Li, Jing
AU - Wu, Jia
AU - Chang, Jun
AU - Liu, Donghua
AU - Zhu, Kai
PY - 2024/5
Y1 - 2024/5
N2 - Aspect-based sentiment analysis (ABSA) aims to determine the sentiment expressed in a text towards a particular aspect. While previous models have utilized dependency graphs and graph neural networks (GNNs) to facilitate information exchange, they face challenges such as over-smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose a new approach called SRE-BERT that flexibly utilizes syntactic knowledge to enhance aspect representations by relying on syntax representations. First, we propose a syntax representation encoder to obtain a syntactic vector for each token. Then, we devise a syntax-guided transformer that employs syntax representations to compute multi-head attention, thereby enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors derived from the syntax-guided transformer are used to enhance the semantic representations obtained by BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) method to pre-train the syntax encoder. Extensive experiments were conducted on datasets from three distinct domains, and the results indicate that our SRE-BERT outperforms the second-ranked model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
AB - Aspect-based sentiment analysis (ABSA) aims to determine the sentiment expressed in a text towards a particular aspect. While previous models have utilized dependency graphs and graph neural networks (GNNs) to facilitate information exchange, they face challenges such as over-smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose a new approach called SRE-BERT that flexibly utilizes syntactic knowledge to enhance aspect representations by relying on syntax representations. First, we propose a syntax representation encoder to obtain a syntactic vector for each token. Then, we devise a syntax-guided transformer that employs syntax representations to compute multi-head attention, thereby enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors derived from the syntax-guided transformer are used to enhance the semantic representations obtained by BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) method to pre-train the syntax encoder. Extensive experiments were conducted on datasets from three distinct domains, and the results indicate that our SRE-BERT outperforms the second-ranked model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
KW - Aspect-based sentiment analysis
KW - BERT
KW - Syntax representation
KW - Syntax-guided transformer
UR - http://www.scopus.com/inward/record.url?scp=85182021580&partnerID=8YFLogxK
U2 - 10.1016/j.ipm.2023.103630
DO - 10.1016/j.ipm.2023.103630
M3 - Article
AN - SCOPUS:85182021580
SN - 0306-4573
VL - 61
SP - 1
EP - 20
JO - Information Processing and Management
JF - Information Processing and Management
IS - 3
M1 - 103630
ER -