Unsupervised Domain Adaptation for Sentimental Classification by Word Embeddings on the Lower Layer of BERT

Cited by: 0
Authors
Bai, Jing [1 ]
Tanaka, Hirotaka [1 ]
Cao, Rui [1 ]
Ma, Wen [1 ]
Shinnou, Hiroyuki [1 ]
Affiliations
[1] Ibaraki Univ, Grad Sch Sci & Engn, Comp & Informat Sci, Hitachi, Ibaraki, Japan
Source
2019 INTERNATIONAL CONFERENCE ON TECHNOLOGIES AND APPLICATIONS OF ARTIFICIAL INTELLIGENCE (TAAI) | 2019
DOI
10.1109/taai48200.2019.8959881
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Bidirectional Encoder Representations from Transformers (BERT) is a stack of 12 or 24 Transformer layers, each of which applies multi-head attention. The multi-head attention of each layer outputs a sequence of word embeddings corresponding to the input word sequence. When BERT is used in a feature-based manner, the word-embedding sequence from the highest layer is used for the downstream task. In domain adaptation, on the other hand, projecting the data of each domain onto a common subspace of the source and target domains is an effective approach. When a feature vector on such a common subspace is constructed from BERT's word-embedding outputs, the highest layer is specialized to the learning task given to BERT, so it is not necessarily more useful than the word embeddings of a lower layer; the highest layer can be suboptimal for domain adaptation. In this paper, we confirm this idea on unsupervised domain adaptation for sentiment analysis.
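The common-subspace idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes feature vectors (e.g., sentence embeddings taken from a lower BERT layer) are already available as 768-dimensional arrays, uses randomly generated data as stand-ins, and finds the shared subspace by SVD on the pooled, mean-centered data; the function name `common_subspace` and the dimension `k=50` are illustrative choices.

```python
import numpy as np

# Stand-ins for source- and target-domain feature vectors
# (e.g., per-sentence embeddings from a lower BERT layer).
rng = np.random.default_rng(0)
src = rng.normal(loc=0.0, scale=1.0, size=(100, 768))  # source domain
tgt = rng.normal(loc=0.5, scale=1.0, size=(100, 768))  # target domain

def common_subspace(src, tgt, k=50):
    """Project both domains onto the top-k principal directions
    of the pooled, mean-centered data."""
    pooled = np.vstack([src, tgt])
    mean = pooled.mean(axis=0)
    # Rows of vt are the principal directions of the pooled data.
    _, _, vt = np.linalg.svd(pooled - mean, full_matrices=False)
    basis = vt[:k].T                      # (768, k) projection matrix
    return (src - mean) @ basis, (tgt - mean) @ basis

src_k, tgt_k = common_subspace(src, tgt, k=50)
print(src_k.shape, tgt_k.shape)           # (100, 50) (100, 50)
```

Because the basis is computed from the pooled data of both domains, features from either domain land in the same coordinate system, which is the property the abstract relies on when comparing layers.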
Pages: 6