Reaching Quality and Efficiency with a Parameter-Efficient Controllable Sentence Simplification Approach

Cited: 0
Authors
Menta, Antonio [1 ]
Garcia-Serrano, Ana [1 ]
Affiliations
[1] UNED, ETSI Informat, C de Juan del Rosal 14, Madrid 28040, Spain
Keywords
Text Simplification; Transfer Learning; Language Models;
DOI
10.2298/CSIS230912017M
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The task of Automatic Text Simplification (ATS) aims to transform texts to improve their readability and comprehensibility. Current solutions are based on Large Language Models (LLMs). These models perform well but require powerful computing resources and large amounts of data to be fine-tuned for specific and technical domains, which prevents most researchers from adapting them to their area of study. The main contributions of this research are as follows: (1) we propose an accurate solution for settings where powerful resources are not available, exploiting transfer learning across domains with a set of linguistic features and a reduced-size pre-trained language model (T5-small), making the approach accessible to a broader range of researchers and individuals; (2) we evaluate our model on two well-known datasets, TurkCorpus and ASSET, and analyse the influence of control tokens on the SimpleText corpus, focusing on the domains of Computer Science and Medicine. Finally, a detailed discussion comparing our approach with state-of-the-art sentence simplification models is included.
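The controllable approach the abstract refers to conditions the model on linguistic control tokens prepended to the input sentence, in the style of ACCESS-like systems. As a minimal sketch (the token name `NbChars`, the 0.05 bucketing step, and the helper names are assumptions, not the paper's exact implementation), the character-length-ratio token for a training pair can be computed like this:

```python
def nbchars_token(source: str, target: str, step: float = 0.05) -> str:
    """Bucket the target/source character-length ratio to the nearest
    `step` and format it as a control token (hypothetical token name)."""
    ratio = len(target) / len(source)
    bucketed = round(ratio / step) * step
    return f"<NbChars_{bucketed:.2f}>"

def prepend_control_tokens(source: str, target: str) -> str:
    """Build the model input: control token(s) followed by the source
    sentence; the target sentence stays unchanged as the training label."""
    return f"{nbchars_token(source, target)} {source}"

# Training-time example: the token tells the model how much to compress.
pair = ("The cat sat on the mat.", "The cat sat.")
print(prepend_control_tokens(*pair))  # → <NbChars_0.50> The cat sat on the mat.
```

At inference time the ratio is not computed from a reference but chosen by the user, which is what makes the simplification controllable: the same fine-tuned T5-small can be steered toward more or less aggressive compression by changing the token value.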
Pages: 899 - 921
Page count: 24