Chinese Named Entity Recognition with Integrated Channel Attention

Cited by: 0
Authors
Song, Wei [1 ]
Zheng, He [1 ]
Guo, Wei [2 ]
Ning, Keqing [1 ]
Affiliations
[1] North China Univ Technol, Sch Informat Sci & Technol, Beijing, Peoples R China
[2] North China Univ Technol, Sch Elect & Control Engn, Beijing, Peoples R China
Source
2024 5th International Conference on Artificial Intelligence and Computer Engineering (ICAICE), 2024
Keywords
Chinese Named Entity Recognition; IDCAN; Efficient Channel Attention; Natural Language Processing;
DOI
10.1109/ICAICE63571.2024.10863887
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Using an Iterated Dilated Convolutional Neural Network (IDCNN) for Chinese Named Entity Recognition (NER) helps capture local information. However, tokens at different positions in a sentence do not contribute equally to the decision at the current position, which must be taken into account when dilating a Dilated Convolutional Neural Network (DCNN). To address this, this paper incorporates Efficient Channel Attention (ECA) into the DCNN and constructs the Iterated Dilated Convolutional Attention Network (IDCAN). The model weights the importance of different channels and integrates the Linguistically-motivated Bidirectional Encoder Representation from Transformers (LERT), Bidirectional Gated Recurrent Units (BiGRU), and Conditional Random Fields (CRF) to form a Chinese NER model: LERT constructs word vectors, BiGRU captures temporal features, IDCAN extracts key local features, and CRF decodes the features. On the Daily, MSRA, and E-commerce Chinese datasets, the model achieves F1 scores of 96.88%, 95.15%, and 90.52%, respectively, outperforming the compared models.
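The IDCAN block described in the abstract combines dilated convolution with ECA-style channel gating. A minimal NumPy sketch of those two operations follows; function names are illustrative, and the fixed averaging kernel in `eca` stands in for the small learned 1-D convolution the ECA mechanism actually trains:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Same-length 1-D dilated convolution.
    x: (C_in, L) feature map, w: (C_out, C_in, K) filters with odd K."""
    C_out, C_in, K = w.shape
    L = x.shape[1]
    pad = dilation * (K - 1) // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    out = np.zeros((C_out, L))
    for t in range(L):
        for k in range(K):
            # tap the input at strides of `dilation` to widen the receptive field
            out[:, t] += w[:, :, k] @ xp[:, t + k * dilation]
    return out

def eca(x, k=3):
    """Efficient Channel Attention: squeeze over length, run a k-wide 1-D
    convolution across channels, sigmoid-gate each channel. x: (C, L)."""
    C, _ = x.shape
    s = x.mean(axis=1)                       # (C,) per-channel descriptor
    pad = (k - 1) // 2
    sp = np.pad(s, (pad, pad))
    kernel = np.ones(k) / k                  # toy fixed kernel; learned in ECA
    a = np.array([sp[i:i + k] @ kernel for i in range(C)])
    gate = 1.0 / (1.0 + np.exp(-a))          # sigmoid attention weights
    return x * gate[:, None]                 # re-weight channels
```

In an IDCAN-style stack, several `dilated_conv1d` layers with growing dilation rates would feed `eca`, so channels carrying more informative local patterns are amplified before CRF decoding.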
Pages: 55-63 (9 pages)