Unifying Cross-lingual Summarization and Machine Translation with Compression Rate

Citations: 5
Authors
Bai, Yu [1 ,2 ]
Huang, Heyan [1 ,3 ]
Fan, Kai [4 ]
Gao, Yang [1 ]
Zhu, Yiming [1 ]
Zhan, Jiaao [1 ]
Chi, Zewen [1 ]
Chen, Boxing [4 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing, Peoples R China
[2] Beijing Engn Res Ctr High Volume Language Informa, Beijing, Peoples R China
[3] Southeast Acad Informat Technol, Putian, Fujian, Peoples R China
[4] Alibaba DAMO Acad, Machine Intelligence Technol Lab, Hangzhou, Peoples R China
Source
PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22) | 2022
Funding
National Natural Science Foundation of China;
Keywords
Cross-lingual Summarization; Machine Translation; Compression Rate;
DOI
10.1145/3477495.3532071
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology];
Discipline Code
0812;
Abstract
Cross-Lingual Summarization (CLS) is the task of extracting the important information from a source document and condensing it into a summary in another language. It is challenging because a system must understand, summarize, and translate at the same time, making it closely related to Monolingual Summarization (MS) and Machine Translation (MT). In practice, training resources for Machine Translation far exceed those for cross-lingual and monolingual summarization, so incorporating a Machine Translation corpus into CLS should benefit its performance. However, existing work brings Machine Translation in only through a simple multi-task framework, without deeper exploration. In this paper, we propose a novel task, Cross-lingual Summarization with Compression rate (CSC), to improve Cross-Lingual Summarization with a large-scale Machine Translation corpus. By introducing the compression rate, the information ratio between the source and the target text, we regard the MT task as a special CLS task with a compression rate of 100%. The two tasks can therefore be trained as a unified task, sharing knowledge more effectively. However, a large gap remains between the MT task and the CLS task: samples with compression rates between 30% and 90% are extremely rare. To bridge the two tasks smoothly, we propose an effective data augmentation method that produces document-summary pairs with different compression rates. The proposed method not only improves the performance of the CLS task but also provides controllability to generate summaries of desired lengths. Experiments demonstrate that our method outperforms various strong baselines on three cross-lingual summarization datasets. We release our code and data at https://github.com/ybai-nlp/CLS_CIR.
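The abstract's central idea, treating MT pairs and summarization pairs on one compression-rate axis, can be sketched minimally. This is an illustrative reading of the abstract, not the paper's implementation; the function name and token-count definition of "information ratio" are assumptions.

```python
def compression_rate(source_tokens, target_tokens):
    """Hypothetical length-ratio reading of the paper's 'information
    ratio' between target and source text. A translation pair keeps
    (roughly) all content, so its rate is near 100%; a cross-lingual
    summary pair compresses well below that."""
    return len(target_tokens) / len(source_tokens)

# An MT-style pair: every source token has a target counterpart.
mt_rate = compression_rate(["the", "cat", "sat"], ["le", "chat", "assis"])
# A CLS-style pair: a 100-token document reduced to a 20-token summary.
cls_rate = compression_rate(["w"] * 100, ["s"] * 20)
print(mt_rate, cls_rate)  # 1.0 0.2
```

Under this view, the paper's data augmentation would fill the sparsely populated 30%–90% band between the two regimes.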
Pages: 1087-1097
Page count: 11