Advanced Generative AI Methods for Academic Text Summarization

Cited by: 0
Authors
Dar, Zaema [1 ]
Raheel, Muhammad [1 ]
Bokhari, Usman [1 ]
Jamil, Akhtar [1 ]
Alazawi, Esraa Mohammed [2 ]
Hameed, Alaa Ali [3 ]
Affiliations
[1] Natl Univ Comp & Emerging Sci, Dept Comp Sci, Islamabad, Pakistan
[2] Univ Baghdad, Dept Comp Engn, Baghdad, Iraq
[3] Istinye Univ, Dept Comp Engn, Istanbul, Turkiye
Source
2024 IEEE 3rd International Conference on Computing and Machine Intelligence (ICMI 2024) | 2024
Keywords
Natural Language Processing; Scientific Summarization; Transformers; LED Large; Pegasus-Large; BART; SciBERT; Literature Review Generation; Cosine Similarity; Deep Learning;
DOI
10.1109/ICMI60790.2024.10585622
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The exponential growth of scientific literature underscores the need for advanced text summarization techniques that can significantly accelerate the research process. This study addresses that challenge by advancing scientific text summarization through AI and deep learning methods. We investigate the integration and fine-tuning of state-of-the-art models, including LED Large, Pegasus variants, and BART, to refine the summarization process. Novel combinations, such as SciBERT with LED Large, were explored to capture critical details frequently missed by traditional methods, yielding notable improvements in summarization effectiveness. Our findings indicate that models such as LED Large adapt quickly to training data, reaching strong semantic understanding in fewer training epochs, as evidenced by a Flesch Reading Ease Score (FRES) of 28.5852 and a ROUGE-1 F1 score of 0.4991. However, while extensively trained models such as BART Large and Pegasus displayed strong semantic capabilities, their summaries still require refinement in readability and higher-order n-gram overlap.
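To make the pipeline the abstract describes concrete, the following minimal Python sketch summarizes a long document with LED and scores the output with ROUGE F1 and FRES. It is illustrative only, not the authors' released code: the public allenai/led-large-16384-arxiv checkpoint stands in for the paper's fine-tuned LED Large, and the rouge-score and textstat packages stand in for whatever evaluation code the authors actually used.

    from transformers import LEDTokenizer, LEDForConditionalGeneration
    from rouge_score import rouge_scorer  # pip install rouge-score
    import textstat                       # pip install textstat

    # Assumed public checkpoint; the paper fine-tunes LED Large on its own data.
    MODEL = "allenai/led-large-16384-arxiv"
    tokenizer = LEDTokenizer.from_pretrained(MODEL)
    model = LEDForConditionalGeneration.from_pretrained(MODEL)

    def summarize(document, max_new_tokens=256):
        inputs = tokenizer(document, return_tensors="pt",
                           truncation=True, max_length=16384)
        # LED combines local windowed attention with global attention;
        # flagging the first token as global is the usual convention.
        global_attention_mask = inputs.input_ids.new_zeros(inputs.input_ids.shape)
        global_attention_mask[:, 0] = 1
        summary_ids = model.generate(
            inputs.input_ids,
            attention_mask=inputs.attention_mask,
            global_attention_mask=global_attention_mask,
            num_beams=4,
            max_new_tokens=max_new_tokens,
        )
        return tokenizer.decode(summary_ids[0], skip_special_tokens=True)

    def evaluate(candidate, reference):
        # ROUGE F1 (cf. the paper's ROUGE-1 F1 of 0.4991) plus FRES readability.
        scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"],
                                          use_stemmer=True)
        scores = {name: s.fmeasure
                  for name, s in scorer.score(reference, candidate).items()}
        scores["fres"] = textstat.flesch_reading_ease(candidate)
        return scores

The keywords also pair cosine similarity with SciBERT; a hedged sketch of that semantic check follows, where mean pooling over SciBERT token embeddings is an assumption, since the record does not specify the pooling strategy:

    import torch
    from transformers import AutoTokenizer, AutoModel

    sci_tok = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
    sci_model = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")

    def scibert_cosine(text_a, text_b):
        def embed(text):
            batch = sci_tok(text, return_tensors="pt",
                            truncation=True, max_length=512)
            with torch.no_grad():
                hidden = sci_model(**batch).last_hidden_state  # (1, seq_len, 768)
            return hidden.mean(dim=1).squeeze(0)  # mean-pooled vector (assumption)
        return torch.nn.functional.cosine_similarity(
            embed(text_a), embed(text_b), dim=0).item()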
Pages: 7