Multi-task Learning Neural Networks for Comparative Elements Extraction

Cited by: 1
Authors
Liu, Dianqing [1 ]
Wang, Lihui [1 ]
Shao, Yanqiu [1 ]
Affiliations
[1] Beijing Language & Culture Univ, Sch Informat Sci, Beijing 100083, Peoples R China
Source
CHINESE LEXICAL SEMANTICS (CLSW 2020) | 2021, Vol. 12278
Funding
Fundamental Research Funds for the Central Universities; National Natural Science Foundation of China;
Keywords
Comparative elements extraction; Neural networks; BERT-CRF; Multi-task learning; RULES;
DOI
10.1007/978-3-030-81197-6_33
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Comparative sentences are common in human languages. In online comments, a comparative sentence usually conveys a reviewer's subjective attitude or emotional tendency, so comparative elements extraction (CEE) is valuable for opinion mining and sentiment analysis. Most existing CEE systems use rule-based or machine-learning approaches, which require constructing a rule base or investing substantial effort in feature engineering. These approaches usually involve multiple steps, and the performance of each step depends on the accuracy of the previous one, so errors can cascade across steps. In this paper, we adopt a neural network approach to CEE that supports end-to-end training and learns sentence representations automatically. Furthermore, because CEE is closely related to comparative sentence recognition (CSR), we propose a multi-task learning model that combines the two tasks and further improves CEE performance. Experimental results show that both our neural network approach and multi-task learning are effective for CEE.
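To make the shared-encoder idea in the abstract concrete, the sketch below shows one plausible PyTorch/Transformers implementation of the multi-task setup: a single BERT encoder feeds a token-level tagging head for CEE and a sentence-level classification head for CSR, trained with a weighted joint loss. The class name MultiTaskCEEModel, the label sizes, the loss weight alpha, and the plain softmax tagging head (standing in for the paper's CRF layer) are illustrative assumptions, not the authors' exact configuration.

```python
import torch.nn as nn
from transformers import BertModel


class MultiTaskCEEModel(nn.Module):
    """Shared BERT encoder with one head per task (hypothetical sketch)."""

    def __init__(self, num_tags: int, num_classes: int = 2,
                 bert_name: str = "bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)   # shared encoder
        hidden = self.bert.config.hidden_size
        self.tagger = nn.Linear(hidden, num_tags)           # CEE: per-token tag logits
        self.classifier = nn.Linear(hidden, num_classes)    # CSR: per-sentence logits

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        tag_logits = self.tagger(out.last_hidden_state)     # (batch, seq_len, num_tags)
        cls_logits = self.classifier(out.pooler_output)      # (batch, num_classes)
        return tag_logits, cls_logits


def joint_loss(tag_logits, cls_logits, tag_labels, cls_labels, alpha=0.5):
    """Weighted sum of the two task losses; alpha is an assumed hyperparameter."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)  # -100 marks padding/special tokens
    cee_loss = ce(tag_logits.reshape(-1, tag_logits.size(-1)), tag_labels.reshape(-1))
    csr_loss = ce(cls_logits, cls_labels)
    return alpha * cee_loss + (1 - alpha) * csr_loss
```

In the BERT-CRF setting named in the keywords, the softmax tagging head would be replaced by a CRF layer that scores whole label sequences; the joint loss would otherwise be combined in the same weighted fashion.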
Pages: 398-407
Number of pages: 10
Related Papers
50 records in total
  • [31] A multi-task learning based approach to biomedical entity relation extraction. Li, Qingqing; Yang, Zhihao; Luo, Ling; Wang, Lei; Zhang, Yin; Lin, Hongfei; Wang, Jian; Yang, Liang; Xu, Kan; Zhang, Yijia. PROCEEDINGS 2018 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2018: 680-682.
  • [32] Multi-task gradient descent for multi-task learning. Lu Bai; Yew-Soon Ong; Tiantian He; Abhishek Gupta. Memetic Computing, 2020, 12: 355-369.
  • [33] Multi-task neural networks by learned contextual inputs. Sandnes, Anders T.; Grimstad, Bjarne; Kolbjornsen, Odd. NEURAL NETWORKS, 2024, 179.
  • [34] Multi-task gradient descent for multi-task learning. Bai, Lu; Ong, Yew-Soon; He, Tiantian; Gupta, Abhishek. MEMETIC COMPUTING, 2020, 12(04): 355-369.
  • [35] Dual-Task Network for Terrace and Ridge Extraction: Automatic Terrace Extraction via Multi-Task Learning. Zhang, Jun; Zhang, Jun; Huang, Xiao; Zhou, Weixun; Fu, Huyan; Chen, Yuyan; Zhan, Zhenghao. REMOTE SENSING, 2024, 16(03).
  • [36] Empirical evaluation of multi-task learning in deep neural networks for natural language processing. Jianquan Li; Xiaokang Liu; Wenpeng Yin; Min Yang; Liqun Ma; Yaohong Jin. Neural Computing and Applications, 2021, 33: 4417-4428.
  • [37] Multi-Task and Multi-Domain Learning with Tensor Networks. Garg, Yash; Prater-Bennette, Ashley; Asif, M. Salman. SIGNAL PROCESSING, SENSOR/INFORMATION FUSION, AND TARGET RECOGNITION XXXII, 2023, 12547.
  • [38] Multi-task learning for the prediction of wind power ramp events with deep neural networks. Dorado-Moreno, M.; Navarin, N.; Gutierrez, P. A.; Prieto, L.; Sperduti, A.; Salcedo-Sanz, S.; Hervas-Martinez, C. NEURAL NETWORKS, 2020, 123: 401-411.
  • [39] Empirical evaluation of multi-task learning in deep neural networks for natural language processing. Li, Jianquan; Liu, Xiaokang; Yin, Wenpeng; Yang, Min; Ma, Liqun; Jin, Yaohong. NEURAL COMPUTING & APPLICATIONS, 2021, 33(09): 4417-4428.
  • [40] Attribute Knowledge Integration for Speech Recognition Based on Multi-task Learning Neural Networks. Zheng, Hao; Yang, Zhanlei; Qiao, Liwei; Li, Jianping; Liu, Wenju. 16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015: 543-547.