Anaphora Resolution of Uyghur Personal Pronouns Based on Multi-attention Mechanism

Cited: 0
Authors
Yang Q.-M. [1 ,2 ,3 ]
Yu L. [2 ,3 ,4 ]
Tian S.-W. [1 ,2 ,3 ]
Wumaier A. [2 ,3 ,5 ]
Affiliations
[1] School of Software, Xinjiang University, Urumqi
[2] Key Laboratory of Software Engineering Technology, Xinjiang University, Urumqi
[3] Key Laboratory of Signal and Information Processing, Xinjiang University, Urumqi
[4] Network Center, Xinjiang University, Urumqi
[5] College of Information Science and Technology, Xinjiang University, Urumqi
Source
Zidonghua Xuebao/Acta Automatica Sinica | 2021, Vol. 47, No. 6
Funding
National Natural Science Foundation of China;
Keywords
Anaphora resolution; Attention mechanism; Context; Independently recurrent neural network;
DOI
10.16383/j.aas.c180678
Abstract
Existing deep neural network models learn only the semantic information of the anaphor and the candidate antecedent; they ignore the importance of individual words in a sentence and cannot capture the continuous associations and dependencies within the word sequence. This paper proposes a Uyghur personal pronoun anaphora resolution method based on a contextual multi-attention independently recurrent neural network (CMAIR). Compared with deep neural networks that rely only on the semantics of the anaphor and candidate antecedent, the method analyzes contextual relations, mines word-sequence dependencies, and improves feature expressiveness. It further incorporates a multi-attention mechanism that attends to multi-level semantic features of the mention to be resolved, compensating for the lack of content-level features and effectively recognizing the relation between personal pronouns and entities. On the Uyghur personal pronoun anaphora resolution task, the method achieves a precision of 90.79%, a recall of 83.25%, and an F-measure of 86.86%. The experimental results show that the CMAIR model significantly improves the performance of Uyghur personal pronoun anaphora resolution. Copyright © 2021 Acta Automatica Sinica. All rights reserved.
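The abstract describes the CMAIR architecture only at a high level. As a rough, hypothetical illustration (not the authors' implementation), the PyTorch sketch below shows how an independently recurrent encoder combined with attention pooling could score a personal pronoun against a candidate antecedent. All module names, layer sizes, and the binary pairwise classifier head are assumptions made for this example.

```python
# Minimal sketch of the idea: an IndRNN layer encodes the context window around
# the anaphor and each candidate antecedent, attention pools the hidden states,
# and a pairwise classifier decides whether the two mentions corefer.
# Names, dimensions, and the scoring head are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IndRNNLayer(nn.Module):
    """IndRNN layer: h_t = relu(W x_t + u * h_{t-1} + b), element-wise recurrence."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.input_proj = nn.Linear(input_size, hidden_size)
        self.recurrent_weight = nn.Parameter(torch.empty(hidden_size).uniform_(-1.0, 1.0))

    def forward(self, x):                       # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.recurrent_weight.numel())
        outputs = []
        for t in range(seq_len):
            h = F.relu(self.input_proj(x[:, t]) + self.recurrent_weight * h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)      # (batch, seq_len, hidden_size)

class AttentionPool(nn.Module):
    """Additive attention over hidden states, returning a weighted summary vector."""
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, h):                        # h: (batch, seq_len, hidden_size)
        weights = F.softmax(self.score(h).squeeze(-1), dim=1)   # (batch, seq_len)
        return torch.bmm(weights.unsqueeze(1), h).squeeze(1)    # (batch, hidden_size)

class PairScorer(nn.Module):
    """Scores whether a candidate antecedent corefers with the pronoun (binary logits)."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.encoder = IndRNNLayer(input_size, hidden_size)
        self.anaphor_attn = AttentionPool(hidden_size)
        self.candidate_attn = AttentionPool(hidden_size)
        self.classifier = nn.Linear(2 * hidden_size, 2)

    def forward(self, anaphor_ctx, candidate_ctx):  # word-embedding context windows
        a = self.anaphor_attn(self.encoder(anaphor_ctx))
        c = self.candidate_attn(self.encoder(candidate_ctx))
        return self.classifier(torch.cat([a, c], dim=-1))

# Example usage (random embeddings standing in for Uyghur word vectors):
# scorer = PairScorer(input_size=100, hidden_size=128)
# logits = scorer(torch.randn(4, 10, 100), torch.randn(4, 10, 100))
```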
Pages: 1412-1421
Number of pages: 9