DAE-NER: Dual-channel attention enhancement for Chinese named entity recognition

Cited: 5
Authors
Liu, Jingxin [1 ]
Sun, Mengzhe [1 ]
Zhang, Wenhao [3 ]
Xie, Gengquan [2 ]
Jing, Yongxia [1 ]
Li, Xiulai [3 ]
Shi, Zhaoxin [3 ]
Affiliations
[1] Qiongtai Normal Univ, Sch Informat Sci & Technol, Haikou 571127, Peoples R China
[2] Hainan Univ, Sch Int Studies, Haikou 570228, Peoples R China
[3] Hainan Univ, Sch Cyberspace Secur, Haikou 570228, Peoples R China
Keywords
Named Entity Recognition; Natural Language Processing; Attention enhancement; Semantic features
DOI
10.1016/j.csl.2023.101581
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Named Entity Recognition (NER) is an important component of Natural Language Processing (NLP) and is a fundamental yet challenging task in text analysis. Recently, NER models for Chinese-language characters have received considerable attention. Owing to the complexity and ambiguity of the Chinese language, the same semantic features carry different levels of importance in different contexts. However, the existing literature on Chinese Named Entity Recognition (CNER) does not capture this difference in importance. To tackle this problem, we propose a new method, referred to as Dual-channel Attention Enhancement for Chinese Named Entity Recognition (DAE-NER). Specifically, we design compression and decompression mechanisms that adapt Chinese characters to different contexts. By adjusting the weights of the semantic feature vector, the semantic weighting is reconstructed to alleviate the interference of contextual differences in semantics. Moreover, to enhance the semantic representation of different granularities in Chinese text, we design attention enhancement modules at the character and sentence levels. These modules dynamically learn the differences among semantic features to enhance important semantic representations in different dimensions. Extensive experiments on four benchmark datasets, namely MSRA, People Daily, Resume, and Weibo, demonstrate that the proposed DAE-NER effectively improves the overall performance of CNER.
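The compression/decompression mechanism described above can be illustrated with a minimal sketch. This is an assumption about the general shape of the technique, not the paper's implementation: it models the mechanism as squeeze-and-excitation-style channel reweighting, where a global context vector is compressed to a bottleneck and decompressed into a sigmoid gate that rescales each semantic feature dimension. The function name `compress_decompress`, the random projections, and the reduction factor are all hypothetical stand-ins for learned parameters.

```python
import numpy as np

def compress_decompress(features: np.ndarray, reduction: int = 4) -> np.ndarray:
    """Hypothetical SE-style reweighting of semantic feature dimensions.

    features: (seq_len, dim) array of character-level feature vectors.
    Returns the same array with each feature dimension gated by a
    context-dependent weight in (0, 1).
    """
    seq_len, dim = features.shape
    # Stand-ins for learned projection matrices (random for illustration).
    rng = np.random.default_rng(0)
    W1 = rng.standard_normal((dim, dim // reduction)) * 0.1  # compression
    W2 = rng.standard_normal((dim // reduction, dim)) * 0.1  # decompression

    squeezed = features.mean(axis=0)               # global context vector
    hidden = np.maximum(squeezed @ W1, 0.0)        # compress + ReLU
    gate = 1.0 / (1.0 + np.exp(-(hidden @ W2)))    # decompress + sigmoid
    return features * gate                         # reweight each dimension
```

Because the gate lies strictly in (0, 1), the mechanism can only attenuate feature dimensions relative to the input, letting context decide which semantic channels to emphasize.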
Pages: 11