Cross-Lingual Knowledge Transferring by Structural Correspondence and Space Transfer

Cited by: 3
Authors
Wang, Deqing [1 ]
Wu, Junjie [2 ,3 ,4 ]
Yang, Jingyuan [5 ]
Jing, Baoyu [6 ]
Zhang, Wenjie [1 ]
He, Xiaonan [7 ]
Zhang, Hui [1 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Beihang Univ, Beijing Key Lab Emergency Support Simulat Technol, Beijing 100191, Peoples R China
[5] George Mason Univ, Sch Business, Fairfax, VA 22030 USA
[6] Univ Illinois, Dept Comp Sci, Champaign, IL 61801 USA
[7] Baidu Inc, Dept Search, Beijing 100094, Peoples R China
Keywords
Task analysis; Machine translation; Analytical models; Transfer learning; Dictionaries; Electronic mail; Time complexity; Cross-lingual sentiment classification; space transfer; structural correspondence learning (SCL); SENTIMENT CLASSIFICATION;
DOI
10.1109/TCYB.2021.3051005
Chinese Library Classification (CLC) code
TP [Automation technology, computer technology];
Discipline classification code
0812 ;
Abstract
Cross-lingual sentiment analysis (CLSA) aims to leverage label-rich resources in the source language to improve models for a resource-scarce domain in the target language, where monolingual machine-learning approaches usually suffer from the unavailability of sentiment knowledge. Recently, the transfer learning paradigm, which can transfer sentiment knowledge from resource-rich languages, for example, English, to resource-poor languages, for example, Chinese, has gained particular interest. Along this line, in this article, we propose semisupervised learning with SCL and space transfer (ssSCL-ST), a semisupervised transfer learning approach that makes use of structural correspondence learning (SCL) as well as space transfer for cross-lingual sentiment analysis. The key idea behind ssSCL-ST, at a high level, is to explore the intrinsic sentiment knowledge in the target-language domain and to reduce the loss of valuable knowledge incurred by knowledge transfer via semisupervised learning. ssSCL-ST also features pivot set extension and space transfer, which help to enhance the efficiency of knowledge transfer and improve classification accuracy in the target-language domain. Extensive experimental results demonstrate the superiority of ssSCL-ST over state-of-the-art approaches without using any parallel corpora.
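This record gives no implementation details, but the abstract names structural correspondence learning (SCL) over pivot features as the core transfer mechanism. The sketch below illustrates only the generic SCL step (learning pivot predictors and deriving a correspondence projection via SVD), assuming a shared bag-of-words feature space and a pre-chosen pivot set; both assumptions, as well as all function names, are illustrative and do not come from the paper, and the pivot set extension, space transfer, and semisupervised components of ssSCL-ST are not reproduced here.

```python
# Minimal sketch of a classic SCL step (not the authors' ssSCL-ST):
# learn linear "pivot predictors" from non-pivot features, then take an SVD
# of their stacked weight vectors as a low-dimensional correspondence projection.
import numpy as np
from sklearn.linear_model import SGDClassifier

def scl_projection(X, pivot_idx, n_components=50):
    """X: (n_docs, n_features) 0/1 feature matrix pooled from both languages.
    pivot_idx: indices of pivot features assumed shared across languages
    (e.g., via a bilingual sentiment lexicon)."""
    non_pivot_idx = np.setdiff1d(np.arange(X.shape[1]), pivot_idx)
    weight_cols = []
    for p in pivot_idx:
        y = (X[:, p] > 0).astype(int)          # does the document contain pivot p?
        if y.min() == y.max():                 # pivot absent or ubiquitous: skip
            continue
        clf = SGDClassifier(loss="modified_huber", alpha=1e-4)
        clf.fit(X[:, non_pivot_idx], y)        # predict the pivot from non-pivots
        weight_cols.append(clf.coef_.ravel())
    W = np.stack(weight_cols, axis=1)          # (n_non_pivot, n_usable_pivots)
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    theta = U[:, :n_components]                # correspondence projection
    return non_pivot_idx, theta

def augment(X, non_pivot_idx, theta):
    # Append projected correspondence features to the originals, so a classifier
    # trained on labeled source-language data also sees the shared structure.
    return np.hstack([X, X[:, non_pivot_idx] @ theta])
```

In this style of pipeline, `augment` would be applied to both the labeled source-language data and the unlabeled target-language data before training a sentiment classifier; how ssSCL-ST extends the pivot set and transfers between spaces is described in the paper itself, not here.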
Pages: 6555 - 6566
Number of pages: 12