Contrastive Aligned Joint Learning for Multilingual Summarization

Cited by: 0
Authors
Wang, Danqing [1 ]
Chen, Jiaze [1 ]
Zhou, Hao [1 ]
Qiu, Xipeng [2 ]
Li, Lei [1 ]
Affiliations
[1] ByteDance AI Lab, Beijing, Peoples R China
[2] Fudan Univ, Shanghai, Peoples R China
Source
FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021 | 2021
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Multilingual text summarization requires the ability to understand documents in multiple languages and generate summaries in the corresponding language, which poses additional challenges for current summarization systems. However, this problem has rarely been studied due to the lack of large-scale supervised summarization data in multiple languages. In this paper, we first provide a large-scale multilingual summarization corpus, MLGSum, consisting of 1.1 million articles and summaries in 12 different languages. Based on this corpus, we develop a unified summarization model that understands documents and generates summaries in different languages. We use a contrastive learning strategy to train our multilingual summarization system (CALMS), which consists of two training objectives: contrastive sentence ranking (CSR) and sentence aligned substitution (SAS). The two objectives are designed to share the ability to extract salient information and to align sentence-level representations across different languages. Experimental results indicate that CALMS achieves significant improvements over monolingual models in all languages. We further transfer CALMS to other languages and find that it also benefits similar languages. Our code and dataset are available at https://github.com/brxx122/CALMS.
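The abstract names the two training objectives but does not spell them out. As a rough illustration only, the following Python (PyTorch) sketch shows one plausible reading of the contrastive sentence ranking (CSR) idea: a margin ranking loss that pushes the model to score more salient sentences above less salient ones. The function name, the rank-scaled margin, and the use of per-sentence salience labels are assumptions made for this sketch, not the paper's exact formulation.

# Hypothetical sketch of a margin-based contrastive sentence-ranking loss.
# The rank-scaled margin and the salience labels (e.g., ROUGE against the
# reference summary) are assumptions, not the exact CSR objective.
import torch
import torch.nn.functional as F

def contrastive_sentence_ranking_loss(scores: torch.Tensor,
                                       salience: torch.Tensor,
                                       margin: float = 0.1) -> torch.Tensor:
    """Encourage sentences with higher salience to receive higher model scores.

    scores:   (num_sentences,) model scores for one document
    salience: (num_sentences,) reference salience used to order the pairs
    """
    # Sort sentence scores from most to least salient.
    order = torch.argsort(salience, descending=True)
    ranked = scores[order]

    losses = []
    for i in range(len(ranked)):
        for j in range(i + 1, len(ranked)):
            # Sentence i outranks sentence j in salience, so its score should
            # exceed sentence j's by at least (j - i) * margin.
            losses.append(F.relu(margin * (j - i) - (ranked[i] - ranked[j])))
    if not losses:
        return scores.new_zeros(())
    return torch.stack(losses).mean()

# Toy usage: three sentence scores and their salience labels.
scores = torch.tensor([0.2, 0.9, 0.4], requires_grad=True)
salience = torch.tensor([0.1, 0.8, 0.3])
loss = contrastive_sentence_ranking_loss(scores, salience)
loss.backward()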
Pages: 2739-2750
Page count: 12