Chinese power dispatching text entity recognition based on a double-layer BiLSTM and multi-feature fusion

Cited by: 13
Authors
Wang, Min [1 ]
Zhou, Tao [1 ]
Wang, Haohao [2 ]
Zhai, Youchun [1 ]
Dong, Xiaobin [1 ]
Affiliations
[1] Hohai Univ, Coll Energy & Elect Engn, Nanjing 211100, Peoples R China
[2] NARI Technol Co Ltd, Nanjing 211106, Peoples R China
Keywords
Power dispatching text; NER; Deep learning; Double-layer BiLSTM; Multiple features
DOI
10.1016/j.egyr.2022.02.272
Chinese Library Classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Subject Classification Codes
0807; 0820
Abstract
A large amount of unstructured data has accumulated, in the form of text, in the daily dispatching work of power systems. To use these texts effectively, the entities in them need to be recognized, such as the names of stations and equipment. Because power dispatching text has a complex composition, this paper first summarizes its characteristics. A character-level entity recognition model based on multiple features, suited to power text, is then proposed. The model combines pretrained character embeddings, left-neighbour entropy, and part-of-speech features to represent the domain characteristics of power dispatching text, and fuses these multiple features before they are input to the network. The double-layer BiLSTM proposed in this paper predicts the character sequence labels, and a CRF layer then optimizes the predicted labels. A power outage maintenance application is chosen for named entity recognition. Experimental results show that our model increases the overall F1 value by 2.26% compared with traditional models, and that recognition of lines and stations improves by 3.88% and 3.99%, respectively. The recognition accuracy of each tag is enhanced. (c) 2022 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the scientific committee of the 2020 International Conference on Power Engineering (ICPE 2020).
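
As a rough illustration of the architecture the abstract describes, the sketch below (PyTorch, using the third-party pytorch-crf package) fuses pretrained character embeddings, a left-neighbour-entropy scalar, and part-of-speech embeddings by concatenation, runs the fused vectors through a two-layer ("double") BiLSTM, and scores tag sequences with a CRF. All names, dimensions, and the entropy helper are assumptions for illustration, not the authors' implementation; in particular, the stacked nn.LSTM is only one plausible reading of the paper's double-layer BiLSTM.

    import math
    from collections import Counter

    import torch
    import torch.nn as nn
    from torchcrf import CRF  # third-party package: pip install pytorch-crf


    def left_neighbour_entropy(target, corpus):
        # One plausible reading of the left-neighbour entropy feature:
        # entropy of the distribution of characters that appear immediately
        # to the left of `target` in the corpus string.
        lefts = Counter(corpus[i - 1] for i in range(1, len(corpus))
                        if corpus.startswith(target, i))
        total = sum(lefts.values())
        if total == 0:
            return 0.0
        return -sum((n / total) * math.log(n / total) for n in lefts.values())


    class DoubleBiLSTMCRF(nn.Module):
        def __init__(self, n_chars, n_pos, n_tags,
                     char_dim=100, pos_dim=25, hidden=128):
            super().__init__()
            self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
            self.pos_emb = nn.Embedding(n_pos, pos_dim, padding_idx=0)
            # Feature fusion by concatenation: character embedding + POS
            # embedding + one scalar entropy value per character.
            self.bilstm = nn.LSTM(char_dim + pos_dim + 1, hidden,
                                  num_layers=2, bidirectional=True,
                                  batch_first=True)
            self.proj = nn.Linear(2 * hidden, n_tags)
            self.crf = CRF(n_tags, batch_first=True)

        def _emissions(self, chars, pos, entropy):
            x = torch.cat([self.char_emb(chars), self.pos_emb(pos),
                           entropy.unsqueeze(-1)], dim=-1)
            h, _ = self.bilstm(x)      # two stacked bidirectional layers
            return self.proj(h)        # per-character tag scores

        def loss(self, chars, pos, entropy, tags, mask):
            # Negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(self._emissions(chars, pos, entropy), tags,
                             mask=mask, reduction='mean')

        def decode(self, chars, pos, entropy, mask):
            # Viterbi decoding of the most likely tag sequence.
            return self.crf.decode(self._emissions(chars, pos, entropy),
                                   mask=mask)

The fusion step is plain concatenation before the recurrent layers, which is the simplest interpretation of "multi-feature fusion"; the paper may use a more elaborate fusion scheme.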
Pages: 980 - 987
Page count: 8
Related Papers
50 items total
  • [41] Recognition method of dance rotation based on multi-feature fusion
    Liu, Yang
    Fan, Meiyan
    Xu, Wenfeng
    INTERNATIONAL JOURNAL OF ARTS AND TECHNOLOGY, 2021, 13 (02) : 91 - 107
  • [42] A Research on the Fruit Recognition Algorithm Based on the Multi-Feature Fusion
    Tang, Yanfeng
    Zhang, Yawan
    Zhu, Ying
    2020 5TH INTERNATIONAL CONFERENCE ON MECHANICAL, CONTROL AND COMPUTER ENGINEERING (ICMCCE 2020), 2020, : 1865 - 1869
  • [43] Chinese Sentence Similarity Computing Based on Multi-Feature Fusion
    Liu, Wei
    Du, QinSheng
    2011 INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS AND NEURAL COMPUTING (FSNC 2011), VOL I, 2011, : 85 - 88
  • [44] Multi-Feature Fusion Based Approach for Robust Face Recognition
    Essa, Almabrok
    Asari, Vijayan
    MOBILE MULTIMEDIA/IMAGE PROCESSING, SECURITY, AND APPLICATIONS 2018, 2018, 10668
  • [45] An Entity Recognition Model Based on Deep Learning Fusion of Text Feature
    Shang, Fengjun
    Ran, Chunfu
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (02)
  • [46] An Entity Relation Extraction Method Based on Dynamic Context and Multi-Feature Fusion
    Ma, Xiaolin
    Wu, Kaiqi
    Kuang, Hailan
    Liu, Xinhua
    APPLIED SCIENCES-BASEL, 2022, 12 (03):
  • [47] Research on multi-feature fusion entity relation extraction based on deep learning
    Xu, Shiao
    Sun, Shuihua
    Zhang, Zhiyuan
    Xu, Fan
    INTERNATIONAL JOURNAL OF AD HOC AND UBIQUITOUS COMPUTING, 2022, 39 (1-2) : 93 - 104
  • [48] Medical Named Entity Recognition Based on Multi-Feature and Co-Attention
    Liu, Xinning
    COMPUTER ENGINEERING AND APPLICATIONS, 2024, 60 (06) : 188 - 198
  • [49] Speech emotion recognition based on multi-feature and multi-lingual fusion
    Wang, Chunyi
    Ren, Ying
    Zhang, Na
    Cui, Fuwei
    Luo, Shiying
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (04) : 4897 - 4907
  • [50] Chinese named entity recognition method based on multiscale feature fusion
    Jiang, Xiaoguang
    INTERNATIONAL JOURNAL OF BIOMETRICS, 2024, 16 (3-4) : 337 - 349