SimpLex: a lexical text simplification architecture

Cited by: 0
Authors
Ciprian-Octavian Truică
Andrei-Ionuţ Stan
Elena-Simona Apostol
Affiliations
[1] Uppsala University, Department of Information Technology
[2] University Politehnica of Bucharest, Computer Science and Engineering Department, Faculty of Automatic Control and Computers
Source
Neural Computing and Applications | 2023 / Vol. 35
Keywords
Text simplification; Complexity prediction; Transformers; Word embeddings; Perplexity;
DOI
Not available
Abstract
Text simplification (TS) is the process of generating easy-to-understand sentences from a given sentence or piece of text. The aim of TS is to reduce both the lexical (which refers to vocabulary complexity and meaning) and syntactic (which refers to the sentence structure) complexity of a given text or sentence without the loss of meaning or nuance. In this paper, we present SimpLex, a novel simplification architecture for generating simplified English sentences. To generate a simplified sentence, the proposed architecture uses either word embeddings (i.e., Word2Vec) and perplexity, or sentence transformers (i.e., BERT, RoBERTa, and GPT2) and cosine similarity. The solution is incorporated into user-friendly, simple-to-use software. We evaluate our system using two metrics, i.e., SARI and Perplexity Decrease. Experimentally, we observe that the transformer models outperform the other models in terms of the SARI score. However, in terms of perplexity, the word embedding-based models achieve the largest decrease. Thus, the main contributions of this paper are: (1) We propose a new word embedding and transformer-based algorithm for text simplification; (2) we design SimpLex—a modular novel text simplification system—that can provide a baseline for further research; and (3) we perform an in-depth analysis of our solution and compare our results with two state-of-the-art models, i.e., LightLS as reported by Glavaš and Štajner (in: Proceedings of the 53rd annual meeting of the association for computational linguistics and the 7th international joint conference on natural language processing, 2015) and NTS-w2v as reported by Nisioi et al. (in: Proceedings of the 55th annual meeting of the association for computational linguistics, 2017). We also make the code publicly available online.
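The abstract describes ranking candidate simplifications by the cosine similarity between the sentence embedding of the original sentence and that of each candidate. A minimal sketch of that ranking step is given below; the toy 3-dimensional vectors and example sentences are illustrative assumptions standing in for real BERT/RoBERTa/GPT2 sentence embeddings, not values from the paper:

```python
import math

def cosine_similarity(u, v):
    # cos(u, v) = (u . v) / (|u| * |v|)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def rank_simplifications(original_vec, candidates):
    # Sort candidate simplified sentences so that the one whose embedding is
    # closest to the original (highest cosine similarity) comes first, i.e.,
    # the simplification that best preserves the original meaning.
    return sorted(candidates,
                  key=lambda c: cosine_similarity(original_vec, candidates[c]),
                  reverse=True)

# Toy "embeddings" standing in for sentence-transformer output.
original = [1.0, 0.0, 1.0]
candidates = {
    "The rules were unclear.": [0.9, 0.1, 0.95],  # close in meaning
    "The weather was nice.":   [0.1, 1.0, 0.05],  # unrelated
}
ranking = rank_simplifications(original, candidates)
print(ranking[0])  # the meaning-preserving candidate ranks first
```

In the full system, the embeddings would come from a pretrained sentence transformer, and the ranking would be applied to sentences produced by substituting simpler words for complex ones.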
Pages: 6265–6280 (15 pages)
References (78 in total)
[1] Al-Thanyyan SS, Azmi AM (2022) Automated text simplification: a survey. ACM Comput Surv 54:1–36
[2] Alarcon R, Moreno L, Martinez P (2021) Lexical simplification system to improve web accessibility. IEEE Access 9:58755–58767
[3] Alva-Manchego F, Scarton C, Specia L (2020) Data-driven sentence simplification: survey and benchmark. Comput Linguist 46:135–187
[4] Copestake A, Flickinger D, Pollard C, Sag IA (2005) Minimal recursion semantics: an introduction. Res Lang Comput 3:281–332
[5] Cortes C, Vapnik V (1995) Support-vector networks. Mach Learn 20:273–297
[6] Erdem E, et al. (2022) Neural natural language generation: a survey on multilinguality, multimodality, controllability and learning. J Artif Intell Res 73:1131–1207
[7] (2022) Evaluation of split-and-rephrase output of the knowledge extraction tool in the intelligent tutoring system. Expert Syst Appl 187:115900
[8] Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9:1735–1780
[9] Merkel D (2014) Docker: lightweight Linux containers for consistent development and deployment. Linux J 2014(239):2
[10] Miller GA (1995) WordNet: a lexical database for English. Commun ACM 38:39–41