Towards Robust Named Entity Recognition for Historic German

Cited by: 0
Authors
Schweter, Stefan [1]
Baiter, Johannes [1]
Affiliations
[1] Bayerische Staatsbibliothek München, Digital Library / Munich Digitization Center, D-80539 Munich, Germany
Source
Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) | 2019
Keywords
DOI
Not available
Chinese Library Classification
H0 [Linguistics];
Discipline Classification Codes
030303; 0501; 050102;
Abstract
Recent advances in language modeling using deep neural networks have shown that these models learn representations that vary with network depth, from morphology up to semantic relationships such as co-reference. We apply pre-trained language models to low-resource named entity recognition for Historic German. We show in a series of experiments that character-based pre-trained language models do not run into trouble when faced with low-resource datasets. Our pre-trained character-based language models improve upon classical CRF-based methods and previous work on Bi-LSTMs, boosting F1 score by up to 6%. Our pre-trained language and NER models are publicly available.
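The abstract describes stacking pre-trained character-based language model embeddings under a sequence tagger. Below is a minimal sketch of that setup using the flair library (Akbik et al., 2018), which the released models are compatible with; the embedding identifiers "de-historic-ha-forward"/"de-historic-ha-backward", the corpus paths, and the hyperparameters are assumptions for illustration, not details confirmed by this record.

```python
# Minimal sketch (not the authors' exact training setup): an NER tagger
# built on pre-trained character-based language model embeddings via
# the flair library (Akbik et al., 2018). Embedding names and corpus
# paths below are assumed placeholders.
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import FlairEmbeddings, StackedEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# CoNLL-style column corpus: token in column 0, NER tag in column 1.
corpus: Corpus = ColumnCorpus(
    "data/", {0: "text", 1: "ner"},
    train_file="train.txt", dev_file="dev.txt", test_file="test.txt",
)
# flair >= 0.10 uses corpus.make_label_dictionary(label_type="ner") instead.
tag_dictionary = corpus.make_tag_dictionary(tag_type="ner")

# Stack forward and backward character-level LM embeddings.
embeddings = StackedEmbeddings([
    FlairEmbeddings("de-historic-ha-forward"),
    FlairEmbeddings("de-historic-ha-backward"),
])

# BiLSTM with a CRF decoding layer on top of the stacked embeddings.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type="ner",
    use_crf=True,
)

# Hyperparameters here are illustrative defaults, not the paper's.
ModelTrainer(tagger, corpus).train("models/historic-ner", max_epochs=100)
```

At inference time, tagger.predict(sentence) annotates a flair Sentence in place; the same stacking pattern allows classical word embeddings to be mixed in alongside the character-level language models.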
Pages: 96-103
Number of pages: 8