
Study on Chinese Named Entity Recognition Based on Dynamic Fusion and Adversarial Training

EasyChair Preprint 9055

12 pages · Date: October 24, 2022

Abstract

In this paper, for the Chinese named entity recognition (NER) task, we use the NEZHA Chinese pre-trained language model as the word embedding layer, encode its output with a BiLSTM network, and attach a CRF layer to optimize the output tag sequence. In addition, we dynamically fuse the hidden states of all NEZHA layers to extract the semantic information of entities more fully, and we introduce perturbation noise into the input for adversarial training, improving the generalization and robustness of the model. The results show that the model and methods used in this paper achieve good performance on the Chinese NER task and significantly improve model training speed.
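
The architecture described above (NEZHA embeddings, dynamic fusion of the layer-wise hidden states, BiLSTM encoding, and CRF decoding) can be sketched in PyTorch as follows. This is a minimal illustration only, not the authors' released code: the Hugging Face checkpoint name, the pytorch-crf package, and the tag count and hidden sizes are placeholder assumptions.

# Minimal sketch: NEZHA hidden states are fused with learned layer weights,
# encoded by a BiLSTM, and decoded with a CRF.
import torch
import torch.nn as nn
from transformers import AutoModel      # assumes a NEZHA checkpoint is available
from torchcrf import CRF                # pip install pytorch-crf (assumption)

class NezhaDynamicFusionBiLSTMCRF(nn.Module):
    def __init__(self, pretrained="sijunhe/nezha-cn-base", num_tags=9, lstm_hidden=256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(pretrained, output_hidden_states=True)
        n_layers = self.encoder.config.num_hidden_layers + 1   # +1 for the embedding layer
        hidden = self.encoder.config.hidden_size
        # Dynamic fusion: one learnable, softmax-normalized weight per layer.
        self.layer_weights = nn.Parameter(torch.zeros(n_layers))
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def _fused_embeddings(self, input_ids, attention_mask):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Stack all hidden states: (n_layers, batch, seq_len, hidden)
        states = torch.stack(outputs.hidden_states, dim=0)
        weights = torch.softmax(self.layer_weights, dim=0).view(-1, 1, 1, 1)
        return (weights * states).sum(dim=0)

    def forward(self, input_ids, attention_mask, tags=None):
        fused = self._fused_embeddings(input_ids, attention_mask)
        encoded, _ = self.bilstm(fused)
        emissions = self.classifier(encoded)
        mask = attention_mask.bool()
        if tags is not None:
            # Negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        return self.crf.decode(emissions, mask=mask)

For the adversarial training step, one common way to "add noise to the input" is an FGM-style perturbation of the word-embedding matrix; the paper may use a different scheme, so the sketch below (and the epsilon value) is an illustrative assumption.

# FGM-style adversarial training step on the embedding matrix (assumption).
def fgm_training_step(model, batch, optimizer, epsilon=1.0):
    loss = model(**batch)
    loss.backward()                           # gradients from the clean example
    emb = model.encoder.get_input_embeddings().weight
    grad = emb.grad
    if grad is not None:
        norm = grad.norm()
        if norm > 0:
            delta = epsilon * grad / norm     # perturbation along the gradient direction
            emb.data.add_(delta)
            adv_loss = model(**batch)         # forward pass on perturbed embeddings
            adv_loss.backward()               # accumulate adversarial gradients
            emb.data.sub_(delta)              # restore the original embeddings
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()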

Keyphrases: Chinese named entity recognition, NEZHA pre-trained language model, natural language processing, adversarial training, dynamic fusion

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:9055,
  author    = {Fan Fei and Linnan Yang and Xingyu Wu and Shengken Lin and Huijie Dong and Changshan Yin},
  title     = {Study on Chinese Named Entity Recognition Based on Dynamic Fusion and Adversarial Training},
  howpublished = {EasyChair Preprint 9055},
  year      = {EasyChair, 2022}}