Context Aware Automatic Subjective and Objective Question Generation using Fast Text to Text Transfer Learning

Cited by: 0
Authors
Agrawal, Arpit [1 ]
Shukla, Pragya [1 ]
Affiliations
[1] DAVV Indore, Inst Engn & Technol, Indore, India
Keywords
Automatic question generation; Text-to-Text Transfer Transformer (T5); natural language processing; word sense disambiguation (WSD); domain adaptation; multipartite graphs; beam-search decoding
DOI
10.14569/IJACSA.2023.0140451
CLC number
TP301 [Theory, Methods]
Subject classification code
081202
Abstract
Online learning has gained tremendous popularity in the last decade, owing to the ability to learn anything, anytime, anywhere from the vast ocean of web resources available. In particular, the worldwide lockdowns during the Covid-19 pandemic drew enormous attention to online learning for value addition and skill development, not only for school and college students but also for working professionals. This massive growth in online learning has made assessment a tedious task that demands training, experience, and resources. Automatic question generation (AQG) techniques have been introduced to address this problem by deriving a question bank from text documents. However, the performance of conventional AQG techniques depends on the availability of large labelled training datasets. The need for deep linguistic knowledge to craft heuristic, hand-crafted rules that transform declarative sentences into interrogative ones complicates the problem further. This paper presents a transfer-learning-based text-to-text transformation model that automatically generates subjective and objective questions from a text document. The proposed AQG model utilizes the Text-to-Text Transfer Transformer (T5), which reframes natural language processing tasks into a unified text-to-text format, and augments it with word sense disambiguation (WSD), ConceptNet, and a domain adaptation framework to improve the meaningfulness of the questions. The FastT5 library with beam-search decoding is used to reduce the model size and increase inference speed by quantizing the whole model through the Open Neural Network Exchange (ONNX) framework. Keyword extraction in the proposed framework is performed using multipartite graphs to enhance context awareness. The qualitative and quantitative performance of the proposed AQG model is evaluated through a comprehensive experimental analysis on the publicly available SQuAD dataset.
Pages: 456-463
Number of pages: 8
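
The pipeline described in the abstract (multipartite-graph keyword extraction followed by question generation with a quantized, ONNX-exported T5 decoded by beam search) can be approximated with off-the-shelf libraries. The following is a minimal sketch, assuming the pke library's MultipartiteRank for keyword extraction and the fastT5 library's export_and_get_onnx_model for ONNX export and quantization; the 't5-small' checkpoint and the 'generate question:' prompt format are placeholders, not the authors' fine-tuned model or exact prompt.

# Minimal sketch of the described pipeline; not the authors' implementation.
# Requires the pke and fastT5 libraries and a spaCy English model.
import pke
from fastT5 import export_and_get_onnx_model
from transformers import AutoTokenizer


def extract_keywords(text, top_n=10):
    # Context-aware keyphrase extraction with the MultipartiteRank algorithm,
    # which ranks candidates over a multipartite graph of topics.
    extractor = pke.unsupervised.MultipartiteRank()
    extractor.load_document(input=text, language="en")
    extractor.candidate_selection(pos={"NOUN", "PROPN", "ADJ"})
    extractor.candidate_weighting(alpha=1.1, threshold=0.74, method="average")
    return [phrase for phrase, _score in extractor.get_n_best(n=top_n)]


# Placeholder checkpoint: the paper fine-tunes T5 for question generation.
MODEL_NAME = "t5-small"
# Exports the encoder/decoder to ONNX, quantizes them, and returns a model
# that still exposes the Hugging Face generate() interface.
model = export_and_get_onnx_model(MODEL_NAME)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)


def generate_question(context, answer, num_beams=4):
    # The prompt format below is an assumption; it depends on how the
    # question-generation checkpoint was fine-tuned.
    prompt = f"generate question: answer: {answer} context: {context}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        num_beams=num_beams,  # beam-search decoding
        max_length=64,
        early_stopping=True,
    )
    return tokenizer.decode(output_ids.squeeze(), skip_special_tokens=True)


if __name__ == "__main__":
    passage = ("The Open Neural Network Exchange (ONNX) is an open format "
               "for representing machine learning models.")
    for keyword in extract_keywords(passage, top_n=3):
        print(keyword, "->", generate_question(passage, keyword))

Each extracted keyphrase serves as a candidate answer around which a question is generated, mirroring the context-aware design described above; swapping in a question-generation-tuned T5 checkpoint would replace the placeholder model and prompt.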