A Fine-Grained Sentiment Analysis Model Based on Multi-Task Learning

EasyChair Preprint no. 13048

6 pages · April 19, 2024


Fine-grained sentiment analysis of textual data is a prominent research topic in natural language processing. Its goal is to predict the sentiment expressed toward each aspect mentioned in a sentence. However, most existing sentiment analysis models focus on either aspect extraction or sentiment polarity classification alone, following a single-task pattern. To address this, this paper introduces BLAB (BERT Local Context Focus AD-BiReGU), a fine-grained sentiment analysis model built on multi-task learning. Incorporating the AD-BiReGU module into the BERT-LCF framework enables the model to perform aspect-word extraction and fine-grained sentiment analysis concurrently. First, a pre-trained BERT model captures the initial features of both the local and global context. In the feature-extraction layer, local context features are obtained by combining the local context focus mechanism with multi-head attention, enabling dynamic context feature masking. In parallel, a two-layer attention-based BiReGU model injects contextual information into the network, capturing long-term dependencies between labels and textual features to extract global features. The local and global representations are then fused and passed through a nonlinear layer to produce the final sentiment-polarity results. Comparative experiments show that adding the AD-BiReGU module yields a clear performance improvement on the aspect-word extraction task in the multi-task setting.
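The dynamic context feature masking mentioned above can be illustrated with a minimal sketch. In the LCF literature this is commonly formulated as a Context Dynamic Mask (CDM): tokens whose semantic relative distance (SRD) to the aspect span exceeds a threshold have their hidden states zeroed before the local features are fused with the global ones. The function below is a hypothetical NumPy illustration of that idea, not the paper's implementation; the threshold value and array shapes are assumptions.

```python
import numpy as np

def cdm_mask(seq_len, aspect_start, aspect_end, threshold, hidden_dim):
    """Context Dynamic Mask (CDM) sketch: keep hidden states of tokens whose
    semantic relative distance (SRD) to the aspect span [aspect_start,
    aspect_end] is <= threshold; zero out the rest.
    Returns a (seq_len, hidden_dim) array of 0/1 rows."""
    mask = np.zeros((seq_len, hidden_dim))
    for i in range(seq_len):
        if i < aspect_start:
            srd = aspect_start - i      # distance to the left edge of the aspect
        elif i > aspect_end:
            srd = i - aspect_end        # distance to the right edge of the aspect
        else:
            srd = 0                     # token is inside the aspect span
        if srd <= threshold:
            mask[i] = 1.0
    return mask

# Toy usage: mask encoder hidden states to obtain local-context features,
# which would then be fused with the global features before the output layer.
hidden = np.random.rand(8, 4)                       # (seq_len, hidden_dim), illustrative
local = hidden * cdm_mask(8, 3, 4, threshold=2, hidden_dim=4)
```

Here tokens more than two positions away from the aspect span contribute nothing to the local representation, which is how the local context focus mechanism concentrates the model on the aspect's neighborhood.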

Keyphrases: aspect polarity classification, aspect-level sentiment analysis, BERT model, feature extraction, multi-task learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:

  @booklet{EasyChair:13048,
  author = {Xin Fan and Zhonglin Zhang},
  title = {A Fine-Grained Sentiment Analysis Model Based on Multi-Task Learning},
  howpublished = {EasyChair Preprint no. 13048},
  year = {EasyChair, 2024}}