DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT

Abstract

With the development of the Internet, text-based data from the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short texts on social media has attracted the interest of researchers. Aspect sentiment classification is a kind of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural network-based sentiment analysis models not only have a complicated computational structure but also suffer from computational dependence. To address these problems and improve the accuracy of target-based sentiment classification for short texts, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and emotional words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. The training speed of the proposed DA-BERT model is greatly improved because the computational dependencies of the RNN structure are removed. Compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM, experiments on the SemEval2014 Task 4 dataset show that the accuracy of the DA-BERT model is improved by 13.63% on average.
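The abstract does not include an implementation, but the architecture it describes (a BERT encoder whose token states are re-weighted by an attention layer focused on the target words, with no recurrent structure) can be sketched briefly. The PyTorch sketch below is illustrative only: the class name AspectAttentionClassifier, the bilinear attention form, and the choice to encode the aspect term as the second BERT segment are assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class AspectAttentionClassifier(nn.Module):
    # Hypothetical sketch: BERT token states are scored against a pooled
    # aspect representation by a bilinear attention layer; the weighted
    # context vector is classified into 3 polarities (neg/neu/pos).
    def __init__(self, model_name="bert-base-uncased", num_classes=3):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.attn = nn.Bilinear(hidden, hidden, 1)  # assumed attention form
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, aspect_mask):
        # (batch, seq_len, hidden) contextual token states from BERT.
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Mean-pool the aspect tokens into a single target vector.
        aspect_vec = (states * aspect_mask.unsqueeze(-1)).sum(1)
        aspect_vec = aspect_vec / aspect_mask.sum(1, keepdim=True).clamp(min=1)
        # Score each token against the aspect, mask padding, normalize.
        scores = self.attn(states,
                           aspect_vec.unsqueeze(1).expand_as(states)).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # Attention-weighted sentence representation -> polarity logits.
        context = (weights.unsqueeze(-1) * states).sum(1)
        return self.classifier(context)

if __name__ == "__main__":
    tok = BertTokenizer.from_pretrained("bert-base-uncased")
    # Encode sentence and aspect term as a BERT sentence pair; the
    # token_type_ids then mark the aspect tokens (an assumed encoding).
    enc = tok("great food but the service was dreadful", "service",
              return_tensors="pt")
    model = AspectAttentionClassifier()
    logits = model(enc["input_ids"], enc["attention_mask"],
                   enc["token_type_ids"].float())
    print(logits.shape)  # torch.Size([1, 3])

Note that nothing in this sketch is recurrent: every token is scored against the aspect in parallel, which is the property the abstract credits for the training-speed gain over LSTM-based models.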

Publication
In “Advanced Parallel Processing Technologies, 13th International Symposium”