Yu-Chih Deng, Cheng-Yu Tsai, Yih-Ru Wang, Sin-Horng Chen, Lung-Hao Lee*.
IEEE Access, vol. 10, pp. 126612-126620.
Abstract
Phrase-level sentiment intensity prediction is difficult because linguistic modifiers (e.g., negators, degree adverbs, and modals) can shift the intensity or reverse the polarity of the words they modify. This study develops a graph-based Chinese parser based on the deep biaffine attention model to obtain dependency structures and relations. The obtained dependency features are then used in our proposed Weighted-sum Tree GRU network to predict phrase-level sentiment intensity in the valence-arousal dimensions. Dependency parsing results on the Sinica Treebank indicate that our graph-based model outperforms transition-based methods such as MLP and stack-LSTM, consistent with findings reported for English dependency parsing. Experimental results on the Chinese EmoBank indicate that our Weighted-sum Tree GRU network outperforms transformer-based neural networks such as BERT, ALBERT, XLNet and ELECTRA, reflecting the effectiveness of linguistic dependencies for phrase-level sentiment intensity prediction. In addition, quantitative analysis shows that our model requires fewer parameters and less inference time, making it relatively lightweight and efficient.
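To illustrate the deep biaffine attention mechanism the abstract refers to, the sketch below shows how arc scores between dependent and head representations are typically computed in such parsers. This is a minimal, hedged NumPy sketch of the general biaffine scoring idea, not the paper's implementation; all dimensions, function names, and variables are illustrative assumptions.

```python
import numpy as np

def biaffine_arc_scores(H_dep, H_head, U, w):
    """Score every (dependent i, head j) pair with a biaffine form.

    H_dep:  (n, d) dependent-role token representations
    H_head: (n, d) head-role token representations
    U:      (d, d) bilinear weight matrix
    w:      (d,)   linear bias weights on the head side
    Returns an (n, n) matrix s where s[i, j] scores token j as head of i.
    """
    # Bilinear term: H_dep @ U @ H_head^T gives pairwise interactions
    bilinear = H_dep @ U @ H_head.T
    # Linear head-side bias, broadcast across all dependents
    bias = H_head @ w
    return bilinear + bias[None, :]

# Toy example with random representations (illustrative only)
rng = np.random.default_rng(0)
n, d = 5, 8
scores = biaffine_arc_scores(rng.normal(size=(n, d)),
                             rng.normal(size=(n, d)),
                             rng.normal(size=(d, d)),
                             rng.normal(size=d))
# Greedy per-token head choice (a real parser would decode a valid tree)
heads = scores.argmax(axis=1)
print(scores.shape, heads.shape)
```

In practice the dependent- and head-role representations come from separate MLP projections over a shared encoder, and a tree-constrained decoder (e.g., maximum spanning tree) replaces the greedy argmax shown here.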