Self-aware cycle curriculum learning for multiple-choice reading comprehension

Cited by: 0
Authors
Chen, Haihong [1 ]
Li, Yufei [1 ]
Affiliations
[1] Chifeng Univ, Sch Math & Comp Sci, Chifeng, Peoples R China
Keywords
Machine reading comprehension; Multiple-choice; Self-aware; Cycle training strategy; Curriculum learning
DOI
10.7717/peerj-cs.1179
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The multiple-choice reading comprehension task has recently attracted significant interest. The task provides several options for each question and requires the machine to select one of them as the correct answer. Current approaches normally follow a pre-training and then fine-tuning procedure that treats all training examples equally, ignoring their difficulty. Curriculum learning (CL) has shown its effectiveness in addressing this issue and improving model performance. However, previous curriculum learning methods have two problems. First, most are rule-based, insufficiently flexible, and usually suited only to specific tasks such as machine translation. Second, these methods arrange data either from easy to hard or from hard to easy, overlooking the fact that human beings usually learn from easy to difficult and then review from difficult to easy when performing reading comprehension tasks. In this article, we propose a novel Self-Aware Cycle Curriculum Learning (SACCL) approach, which evaluates data difficulty from the model's perspective and trains the model with a cycle training strategy. Experiments show that the proposed approach achieves better performance on the C3 dataset than the baseline, which verifies the effectiveness of SACCL.
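A minimal Python sketch of the cycle curriculum idea described in the abstract follows: example difficulty is scored by the model's own loss (the "self-aware" part), and each cycle trains from easy to hard and then from hard to easy. All names here (score_difficulty, cycle_curriculum_train, the per-example data layout) are hypothetical illustrations, not the authors' actual implementation.

    # Sketch of self-aware cycle curriculum training; assumes a PyTorch
    # classifier, a dataset of (inputs, label) tensor pairs with a batch
    # dimension, and a loss function such as nn.CrossEntropyLoss.
    import torch

    def score_difficulty(model, dataset, loss_fn):
        # "Self-aware" difficulty: score each example by the current
        # model's loss, so higher loss means harder.
        model.eval()
        scores = []
        with torch.no_grad():
            for inputs, label in dataset:
                scores.append(loss_fn(model(inputs), label).item())
        return scores

    def train_one_pass(model, ordered_examples, loss_fn, optimizer):
        # One training pass over the examples in the given order.
        model.train()
        for inputs, label in ordered_examples:
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), label)
            loss.backward()
            optimizer.step()

    def cycle_curriculum_train(model, dataset, loss_fn, optimizer, cycles=2):
        for _ in range(cycles):
            # Re-estimate difficulty from the model's perspective each cycle.
            scores = score_difficulty(model, dataset, loss_fn)
            order = sorted(range(len(dataset)), key=lambda i: scores[i])
            easy_to_hard = [dataset[i] for i in order]
            # Easy -> hard, as in standard curriculum learning...
            train_one_pass(model, easy_to_hard, loss_fn, optimizer)
            # ...then hard -> easy, completing the cycle.
            train_one_pass(model, list(reversed(easy_to_hard)), loss_fn, optimizer)

Re-scoring difficulty at the start of every cycle is what distinguishes a self-aware curriculum from a fixed rule-based ordering: the ranking adapts as the model improves.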
Pages: 18