So far, aspect-based sentiment analysis (ABSA) has involved seven subtasks in total; however, the interactions among them have not been sufficiently explored. This work presents a novel multiplex cascade framework for unified ABSA that maintains such interactions. First, we model all seven subtasks as a hierarchical dependency in an easy-to-hard order, based on which we propose a multiplex decoding mechanism that transfers the sentiment layouts and clues from lower-level tasks to higher-level ones. The multiplex strategy enables highly efficient subtask interflows and avoids repetitive training; meanwhile, it fully utilizes the existing data without requiring any further annotation. Further, based on the characteristics of aspect/opinion term extraction and pairing, we enhance the multiplex framework by integrating POS tag and syntactic dependency information for term-boundary and pairing identification. The proposed Syntax-aware Multiplex (SyMux) framework improves ABSA performance by large margins on 28 settings (7 subtasks × 4 datasets).
The ABSA community has studied at least the following seven representative subtasks:
• aspect term extraction (ATE);
• opinion term extraction (OTE);
• aspect-level sentiment classification (ALSC);
• aspect-oriented opinion extraction (AOE);
• aspect-opinion pair extraction (AOPE);
• aspect extraction and sentiment classification (AESC, i.e., end-to-end ABSA);
• aspect sentiment triplet extraction (ASTE).
All these ABSA subtasks are correlated, as they all revolve around predicting the three sentiment elements: <aspect, opinion, polarity>.
In this project, we consider unified ABSA: solving all of these subtasks with one unified model in one shot. We enhance the individual subtasks by making full use of the interactions among them through a multiplex cascade framework.
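To make the cascade idea concrete, below is a minimal sketch of multiplex decoding, not the authors' implementation: the class name `MultiplexCascade`, the task order, the label-set sizes, the plain bi-GRU encoder, and treating every subtask as token-level tagging are all illustrative assumptions, and the paper's POS/dependency syntax features are omitted here. The core point it shows is that each decoder consumes the shared encoding plus soft label features produced by all lower (easier) subtasks.

```python
import torch
import torch.nn as nn

# Hypothetical easy-to-hard task order and label-set sizes (illustrative only).
TASK_ORDER = ["ATE", "OTE", "ALSC", "AOE", "AOPE", "AESC", "ASTE"]
NUM_LABELS = {"ATE": 3, "OTE": 3, "ALSC": 4, "AOE": 3, "AOPE": 3, "AESC": 5, "ASTE": 7}

class MultiplexCascade(nn.Module):
    """Shared encoder + one decoder per subtask; each decoder also receives
    label features produced by all lower (easier) subtasks in the cascade."""

    def __init__(self, hidden=256, label_dim=32):
        super().__init__()
        self.encoder = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        # Project each subtask's label distribution into a dense feature vector.
        self.label_proj = nn.ModuleDict(
            {t: nn.Linear(NUM_LABELS[t], label_dim) for t in TASK_ORDER}
        )
        self.decoders = nn.ModuleDict()
        for i, t in enumerate(TASK_ORDER):
            in_dim = 2 * hidden + i * label_dim  # encoder states + lower-task label features
            self.decoders[t] = nn.Linear(in_dim, NUM_LABELS[t])

    def forward(self, token_states):               # (batch, seq, hidden)
        shared, _ = self.encoder(token_states)     # (batch, seq, 2*hidden)
        logits, lower_feats = {}, []
        for t in TASK_ORDER:
            feats = torch.cat([shared] + lower_feats, dim=-1)
            logits[t] = self.decoders[t](feats)
            # Pass the soft label layout downstream as a "sentiment clue".
            lower_feats.append(self.label_proj[t](logits[t].softmax(-1)))
        return logits

model = MultiplexCascade()
out = model(torch.randn(2, 10, 256))
print({t: o.shape for t, o in out.items()})
```

Because all decoders share one encoder and run in a single forward pass, the subtasks exchange information without training seven separate models.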
We use the SemEval benchmarks, including Res14, Lap14, Res15, and Res16. To enable multi-task training for unified ABSA, we re-assemble the existing ABSA datasets so that most sentences' annotations cover all seven subtasks; a sketch of such a unified record follows the list below.
• Wang et al. (2017) [1] annotate the unpaired opinion terms (denoted as D17).
• Fan et al. (2019) [2] pair the aspects with opinion terms (D19).
• Peng et al. (2020) [3] further provide the labels for triple extraction (D20).
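As a concrete illustration of the re-assembled format, here is a minimal, hypothetical sketch of one unified record; the sentence, spans, field names, and label scheme are invented for illustration and do not reflect the released data files. It shows how supervision for the simpler subtasks can be projected out of D20-style triples, so one annotation pass covers the whole task hierarchy.

```python
# Hypothetical unified record after merging D17/D19/D20-style annotations.
unified_example = {
    "sentence": "The pizza was great but the service was slow.",
    "triplets": [  # D20-style <aspect, opinion, polarity> triples
        {"aspect": "pizza", "opinion": "great", "polarity": "POS"},
        {"aspect": "service", "opinion": "slow", "polarity": "NEG"},
    ],
}

# Labels for the lower-level subtasks fall out of the triples directly.
aspects    = [t["aspect"] for t in unified_example["triplets"]]                   # aspect extraction
opinions   = [t["opinion"] for t in unified_example["triplets"]]                  # opinion extraction
pairs      = [(t["aspect"], t["opinion"]) for t in unified_example["triplets"]]   # aspect-opinion pairing
polarities = {t["aspect"]: t["polarity"] for t in unified_example["triplets"]}    # aspect-level sentiment
print(aspects, opinions, pairs, polarities, sep="\n")
```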
▶ Main results.
▶ Model Ablation.
▶ Separating or sharing? Sharing by multiplexing!
▶ Multiplexing order in hierarchical dependency.
▶ Model robustness against data scarcity.
@inproceedings{fei2022unifiedABSA,
author = {Hao Fei and Fei Li and Chenliang Li and Shengqiong Wu and Jingye Li and Donghong Ji},
title = {Inheriting the Wisdom of Predecessors: A Multiplex Cascade Framework for Unified Aspect-based Sentiment Analysis},
booktitle = {Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, {IJCAI}},
pages = {4121--4128},
year = {2022},
}