Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph

Cited by: 0
Authors
Chairatanakul, Nuttapong [1 ,2 ]
Sriwatanasakdi, Noppayut [3 ]
Charoenphakdee, Nontawat [4 ,5 ]
Liu, Xin [2 ,6 ]
Murata, Tsuyoshi [1 ,2 ]
Affiliations
[1] Tokyo Inst Technol, Meguro, Japan
[2] AIST, RWBC Oil, Tokyo, Japan
[3] Asur Japan Holdings GK, Roppongi, Japan
[4] Univ Tokyo, Bunkyo, Japan
[5] RIKEN AIP, Chuo, Japan
[6] AIRC, AIST, Tokyo, Japan
Source
FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021 | 2021
Keywords
NETWORKS;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Cross-lingual text classification typically requires task-specific training data in a high-resource source language for the same task as in the low-resource target language. However, collecting such training data can be infeasible because of labeling cost, task characteristics, and privacy concerns. This paper proposes an alternative solution that uses only task-independent word embeddings of high-resource languages and bilingual dictionaries. First, we construct a dictionary-based heterogeneous graph (DHG) from bilingual dictionaries, which opens the possibility of using graph neural networks for cross-lingual transfer. The remaining challenge is the heterogeneity of the DHG, since multiple languages are involved. To address this challenge, we propose the dictionary-based heterogeneous graph neural network (DHGNet), which effectively handles the heterogeneity of the DHG through a two-step aggregation: word-level aggregation followed by language-level aggregation. Experimental results demonstrate that our method outperforms pretrained models even though it does not have access to large corpora. Furthermore, it performs well even when dictionaries contain many incorrect translations. This robustness allows the use of a wider range of dictionaries, such as automatically constructed and crowdsourced dictionaries, which are convenient for real-world applications.
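The two-step aggregation described in the abstract can be illustrated with a toy sketch. Note this is only a minimal illustration, not the paper's model: mean pooling stands in for DHGNet's learned aggregators, and all words, embedding values, and dictionary edges below are invented for the example.

```python
import numpy as np

# Toy 2-D "word embeddings" for words in three languages
# (values are illustrative, not from the paper).
emb = {
    ("en", "dog"): np.array([1.0, 0.0]),
    ("de", "hund"): np.array([0.9, 0.1]),
    ("fr", "chien"): np.array([0.8, 0.2]),
    ("fr", "toutou"): np.array([0.7, 0.3]),
}

# Bilingual-dictionary edges: a word is linked to its translations,
# which may live in several different languages (the heterogeneity).
neighbors = {
    ("en", "dog"): [("de", "hund"), ("fr", "chien"), ("fr", "toutou")],
}

def two_step_aggregate(node):
    """Word-level mean within each neighbor language, then a
    language-level mean across the per-language summaries."""
    per_lang = {}
    for lang, word in neighbors[node]:
        per_lang.setdefault(lang, []).append(emb[(lang, word)])
    # Step 1: word-level aggregation inside each language.
    lang_vecs = [np.mean(vs, axis=0) for vs in per_lang.values()]
    # Step 2: language-level aggregation across languages.
    return np.mean(lang_vecs, axis=0)

h = two_step_aggregate(("en", "dog"))  # -> array([0.825, 0.175])
```

Aggregating per language first keeps a language with many dictionary entries (here French) from dominating the combined representation, which is the motivation for separating the two levels.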
Pages: 1504-1517
Page count: 14
References
82 in total (first 10 listed)
  • [1] [Anonymous], 2016, Neural Information Processing System
  • [2] Arora Gaurav, 2020, P 2 WORKSHOP NLP OPE, P66
  • [3] Artetxe M, 2018, AAAI CONF ARTIF INTE, P5012
  • [4] Bel N, 2003, LECT NOTES COMPUT SC, V2769, P126
  • [5] Bojanowski P., 2017, T ASSOC COMPUT LING, V5, P135, DOI 10.1162/tacl_a_00051
  • [6] PGRA: Projected graph relation-feature attention network for heterogeneous information network embedding
    Chairatanakul, Nuttapong
    Liu, Xin
    Murata, Tsuyoshi
    [J]. INFORMATION SCIENCES, 2021, 570 : 769 - 794
  • [7] Chen M, 2020, PR MACH LEARN RES, V119
  • [8] Domain Adaptation in the Absence of Source Domain Data
    Chidlovskii, Boris
    Clinchant, Stephane
    Csurka, Gabriela
    [J]. KDD'16: PROCEEDINGS OF THE 22ND ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2016, : 451 - 460
  • [9] Choe Yo Joong, 2020, P 12 INT C LANGUAGE
  • [10] Cieri C, 2016, LREC 2016 - TENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, P4543