Document Sub-structure in Neural Machine Translation

Cited by: 0
Authors
Dobreva, Radina [1 ]
Zhou, Jie [2 ]
Bawden, Rachel [1 ]
Affiliations
[1] University of Edinburgh, School of Informatics, 10 Crichton St, Edinburgh, Midlothian, Scotland
[2] Alibaba Group, 969 West Wen Yi Rd, Hangzhou, People's Republic of China
Source
PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020) | 2020
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
machine translation; document structure; corpus creation; context; Wikipedia; parallel corpus;
DOI
Not available
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
Current approaches to machine translation (MT) either translate sentences in isolation, disregarding the context they appear in, or model context at the level of the full document, without a notion of any internal structure the document may have. In this work we consider the fact that documents are rarely homogeneous blocks of text, but rather consist of parts covering different topics. Some documents, such as biographies and encyclopedia entries, have highly predictable, regular structures in which sections are characterised by different topics. We draw inspiration from Louis and Webber (2014) who use this information to improve statistical MT and transfer their proposal into the framework of neural MT. We compare two different methods of including information about the topic of the section within which each sentence is found: one using side constraints and the other using a cache-based model. We create and release the data on which we run our experiments - parallel corpora for three language pairs (Chinese-English, French-English, Bulgarian-English) from Wikipedia biographies, which we extract automatically, preserving the boundaries of sections within the articles.
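As a concrete illustration of the side-constraint approach mentioned in the abstract, the sketch below tags each source sentence with a pseudo-token encoding the topic of the Wikipedia section it appears in, with section topics inferred by an LDA model (Blei et al., 2003, reference [5] below). This is a minimal sketch under stated assumptions: the helper names, the <topic_k> tag format and the toy data are illustrative, not the authors' released implementation.

# Minimal sketch, assuming section-level topics from LDA and a "<topic_k>" pseudo-token
# prepended to each source sentence as a side constraint. Names and tag format are
# illustrative assumptions, not the paper's released code.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def fit_section_topic_model(section_texts, n_topics=10):
    """Fit a bag-of-words LDA model over whole sections (not individual sentences)."""
    vectorizer = CountVectorizer(max_features=5000, stop_words="english")
    counts = vectorizer.fit_transform(section_texts)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    lda.fit(counts)
    return vectorizer, lda

def tag_sentences(sentences, section_text, vectorizer, lda):
    """Prepend the section's most probable topic as a pseudo-token (side constraint)."""
    topic_dist = lda.transform(vectorizer.transform([section_text]))[0]
    topic_id = int(topic_dist.argmax())
    return [f"<topic_{topic_id}> {s}" for s in sentences]

# Toy biography split into sections, each holding the source side of its parallel sentences.
sections = {
    "Early life": ["She was born in Sofia in 1901 .", "Her father was a teacher ."],
    "Career": ["She joined the national theatre in 1925 ."],
}
vectorizer, lda = fit_section_topic_model([" ".join(s) for s in sections.values()], n_topics=2)
for heading, sents in sections.items():
    print(heading, tag_sentences(sents, " ".join(sents), vectorizer, lda))

The cache-based alternative described in the abstract would instead carry topic or lexical information forward across sentences of the same section at decoding time, rather than marking each source sentence up front.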
Pages: 3657-3667
Number of pages: 11
References
27 items in total
[1] [Anonymous], 2007, Proceedings of the Annual Meeting of the Association for Computational Linguistics.
[2] [Anonymous], 2014, Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics.
[3] Barrault, Loïc, 2019, Fourth Conference on Machine Translation, Vol. 2, p. 1.
[4] Bawden, R., 2018, PhD thesis.
[5] Blei, D. M.; Ng, A. Y.; Jordan, M. I. Latent Dirichlet allocation. Journal of Machine Learning Research, 2003, 3(4-5): 993-1022.
[6] Caswell, I., 2019, Fourth Conference on Machine Translation (WMT 2019), Vol. 1: Research Papers, p. 53.
[7] Conneau, A., 2019, Advances in Neural Information Processing Systems, Vol. 32.
[8] Graham, Y., 2019, CoRR, abs/1906.09833.
[9] Hardmeier, C., 2014, PhD thesis.
[10] Kobus, C., 2017, Proceedings of the International Conference on Recent Advances in Natural Language Processing, p. 372.