
TriMod Fusion for Multimodal Named Entity Recognition in Social Media

EasyChair Preprint 13700

10 pages · Date: June 18, 2024

Abstract

Social media platforms serve as invaluable sources of user-generated content, offering insights into various aspects of human behavior. Named Entity Recognition (NER) plays a crucial role in analyzing such content by identifying and categorizing named entities into predefined classes. However, traditional NER models often struggle with the informal, contextually sparse, and ambiguous language of social media. To address these challenges, recent research has turned to multimodal approaches that leverage both textual and visual cues for enhanced entity recognition. Despite these advances, existing methods remain limited in capturing nuanced mappings between visual objects and textual entities and in addressing distributional disparities between modalities. In this paper, we propose a novel approach, TriMod, that integrates textual, visual, and hashtag features, using Transformer attention for effective modality fusion. The improvements exhibited by our model suggest that named entity recognition can benefit substantially from the auxiliary context provided by multiple modalities. Through experiments on a multimodal social media dataset, we demonstrate the superiority of our approach over existing state-of-the-art methods, achieving significant improvements in precision, recall, and F1 score.
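The fusion idea described above can be illustrated with a minimal sketch: text-token embeddings attend over visual-object and hashtag embeddings via scaled dot-product cross-attention, and the resulting context vectors are added back to the text representations. All function names, dimensions, and the residual-sum fusion here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, keys, values):
    """Scaled dot-product attention: each query token attends over keys/values."""
    d = query.shape[-1]
    scores = query @ keys.T / np.sqrt(d)       # (T_q, T_k) attention logits
    return softmax(scores, axis=-1) @ values   # (T_q, d) context vectors

def trimod_fuse(text, visual, hashtag):
    """Hypothetical simplification of tri-modal fusion: fold visual and
    hashtag context into each text token via a residual sum."""
    vis_ctx = cross_attention(text, visual, visual)
    tag_ctx = cross_attention(text, hashtag, hashtag)
    return text + vis_ctx + tag_ctx

rng = np.random.default_rng(0)
text = rng.standard_normal((5, 64))     # 5 text tokens, 64-dim embeddings
visual = rng.standard_normal((3, 64))   # 3 detected visual-object features
hashtag = rng.standard_normal((2, 64))  # 2 hashtag embeddings
fused = trimod_fuse(text, visual, hashtag)
print(fused.shape)  # (5, 64): one fused representation per text token
```

In the full model, the fused per-token representations would feed a standard NER tagging head; the sketch only shows why attention lets each token draw selectively on whichever modality is informative for it.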

Keyphrases: Deep Learning, Hashtag Features, Modality Fusion, Multimodal Approaches, Multimodal Dataset, Named Entity Recognition (NER), Social Media Analysis, Textual and Visual Integration, Transformer Attention

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following workaround produces the correct reference:
@booklet{EasyChair:13700,
  author    = {Mosab Alfaqeeh},
  title     = {TriMod Fusion for Multimodal Named Entity Recognition in Social Media},
  howpublished = {EasyChair Preprint 13700},
  year      = {EasyChair, 2024}}