Sense representations for Portuguese: experiments with sense embeddings and deep neural language models

Cited by: 0
Authors
Jéssica Rodrigues da Silva
Helena de M. Caseli
Affiliation
[1] Federal University of São Carlos (UFSCar)
Source
Language Resources and Evaluation | 2021 / Vol. 55
Keywords
Sense embeddings; Deep neural language models; Word sense disambiguation; Word embeddings; Portuguese;
DOI
Not available
Abstract
Sense representations have gone beyond word representations like Word2Vec, GloVe and FastText and achieved innovative performance on a wide range of natural language processing tasks. Although very useful in many applications, the traditional approaches for generating word embeddings have a strict drawback: they produce a single vector representation for a given word, ignoring the fact that ambiguous words can assume different meanings. In this paper, we explore unsupervised sense representations which, unlike traditional word embeddings, are able to induce different senses of a word by analyzing its contextual semantics in a text. The unsupervised sense representations investigated in this paper are sense embeddings and deep neural language models. We present the first experiments carried out to generate sense embeddings for Portuguese. Our experiments show that the sense embedding model (Sense2vec) outperformed traditional word embeddings in syntactic and semantic analogy tasks, showing that the language resource generated here can improve the performance of NLP tasks in Portuguese. We also evaluated the performance of pre-trained deep neural language models (ELMo and BERT) in two transfer learning approaches, feature-based and fine-tuning, on the semantic textual similarity task. Our experiments indicate that the fine-tuned Multilingual and Portuguese BERT language models achieved better accuracy than the ELMo model and the baselines.
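The core idea behind Sense2vec-style sense embeddings, as described in the abstract, is that each sense of an ambiguous word receives its own key and vector (e.g. by appending a POS or sense tag to the token), so that a context can select among them. The toy sketch below illustrates only this concept; the sense keys, the tags, and the tiny hand-made vectors are all hypothetical and are not taken from the paper's actual model or training data.

```python
# Toy illustration of the sense-embedding idea: tokens are keyed as
# "word|TAG" so each sense of an ambiguous word has a separate vector.
# All vectors below are made up for the demo, not learned embeddings.
import math

# Hypothetical 3-d sense vectors for the ambiguous Portuguese word
# "banco" ("bank" as a financial institution vs. "bench" as a seat).
sense_vectors = {
    "banco|INSTITUTION": [0.9, 0.1, 0.0],
    "banco|SEAT":        [0.1, 0.9, 0.0],
    "dinheiro|NOUN":     [0.8, 0.2, 0.1],   # "money"
    "cadeira|NOUN":      [0.2, 0.8, 0.1],   # "chair"
}

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def nearest_sense(word, context_vec):
    """Return the sense key of `word` whose vector best matches the context."""
    candidates = [k for k in sense_vectors if k.startswith(word + "|")]
    return max(candidates, key=lambda k: cosine(sense_vectors[k], context_vec))

# A money-related context selects the "institution" sense of "banco".
print(nearest_sense("banco", sense_vectors["dinheiro|NOUN"]))  # banco|INSTITUTION
```

A single-vector word embedding would collapse both senses of "banco" into one point; keeping per-sense keys is what lets the analogy and similarity tasks mentioned above distinguish them.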
Pages: 901–924
Page count: 23