Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph

Cited by: 0
Authors
Chairatanakul, Nuttapong [1 ,2 ]
Sriwatanasakdi, Noppayut [3 ]
Charoenphakdee, Nontawat [4 ,5 ]
Liu, Xin [2 ,6 ]
Murata, Tsuyoshi [1 ,2 ]
Affiliations
[1] Tokyo Inst Technol, Meguro, Japan
[2] AIST, RWBC-OIL, Tokyo, Japan
[3] Asur Japan Holdings GK, Roppongi, Japan
[4] Univ Tokyo, Bunkyo, Japan
[5] RIKEN AIP, Chuo, Japan
[6] AIRC, AIST, Tokyo, Japan
Source
FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021 | 2021
Keywords
NETWORKS;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In cross-lingual text classification, task-specific training data in high-resource source languages is required, where the task is identical to that of a low-resource target language. However, collecting such training data can be infeasible because of the labeling cost, task characteristics, and privacy concerns. This paper proposes an alternative solution that uses only task-independent word embeddings of high-resource languages and bilingual dictionaries. First, we construct a dictionary-based heterogeneous graph (DHG) from bilingual dictionaries. This opens the possibility of using graph neural networks for cross-lingual transfer. The remaining challenge is the heterogeneity of the DHG, because multiple languages are considered. To address this challenge, we propose the dictionary-based heterogeneous graph neural network (DHGNet), which effectively handles the heterogeneity of the DHG by two-step aggregations: word-level and language-level aggregations. Experimental results demonstrate that our method outperforms pretrained models even though it does not have access to large corpora. Furthermore, it performs well even when dictionaries contain many incorrect translations. This robustness allows the use of a wider range of dictionaries, such as automatically constructed and crowdsourced dictionaries, which are convenient for real-world applications.
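The abstract describes a two-step aggregation scheme: translations of a word are first aggregated within each language (word-level), and the resulting per-language summaries are then aggregated across languages (language-level). Below is a minimal PyTorch sketch of that idea; the class name, mean-pooling aggregators, and residual combination are illustrative assumptions and not the authors' released implementation.

# Minimal sketch of the two-step (word-level, then language-level) aggregation
# described in the abstract. Names and pooling choices are assumptions.
import torch
import torch.nn as nn


class TwoStepAggregation(nn.Module):
    """One DHGNet-style layer: aggregate dictionary translations per language,
    then aggregate the resulting per-language summaries."""

    def __init__(self, dim: int):
        super().__init__()
        self.word_proj = nn.Linear(dim, dim)   # word-level transform (assumed)
        self.lang_proj = nn.Linear(dim, dim)   # language-level transform (assumed)
        self.activation = nn.ReLU()

    def forward(self, node_emb, neighbours_by_lang):
        # node_emb: (dim,) embedding of the target word.
        # neighbours_by_lang: dict mapping language -> (n_i, dim) tensor of
        #   embeddings of its dictionary translations in that language.
        lang_summaries = []
        for lang, neigh in neighbours_by_lang.items():
            # Step 1: word-level aggregation within one language
            # (mean pooling is an assumption; the paper may use attention).
            lang_summaries.append(self.activation(self.word_proj(neigh.mean(dim=0))))
        if not lang_summaries:
            return node_emb
        # Step 2: language-level aggregation across the per-language summaries.
        lang_stack = torch.stack(lang_summaries, dim=0)
        aggregated = self.activation(self.lang_proj(lang_stack.mean(dim=0)))
        # Combine with the node's own embedding (residual-style, assumed).
        return node_emb + aggregated


if __name__ == "__main__":
    dim = 300  # e.g. fastText embedding size
    layer = TwoStepAggregation(dim)
    word = torch.randn(dim)
    neighbours = {
        "en": torch.randn(3, dim),  # three English translations
        "de": torch.randn(2, dim),  # two German translations
    }
    print(layer(word, neighbours).shape)  # torch.Size([300])

With 300-dimensional fastText-style embeddings, a word node would first pool its dictionary translations within each language and then pool the per-language summaries, as in the toy usage above.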
Pages: 1504-1517
Number of pages: 14