RECURRENT NEURAL NETWORK LANGUAGE MODEL TRAINING WITH NOISE CONTRASTIVE ESTIMATION FOR SPEECH RECOGNITION

Cited by: 0
Authors
Chen, X. [1 ]
Liu, X. [1 ]
Gales, M. J. E. [1 ]
Woodland, P. C. [1 ]
Affiliations
[1] Univ Cambridge, Engn Dept, Trumpington St, Cambridge CB2 1PZ, England
Source
2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP) | 2015
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
language model; recurrent neural network; GPU; noise contrastive estimation; speech recognition;
DOI
Not available
Chinese Library Classification (CLC)
O42 [Acoustics];
Subject Classification Codes
070206; 082403;
Abstract
In recent years, recurrent neural network language models (RNNLMs) have been successfully applied to a range of tasks including speech recognition. However, an important issue that limits the quantity of training data that can be used, and the range of possible applications, is the computational cost of training. A significant part of this cost is associated with the softmax function at the output layer, which requires a normalization term to be calculated explicitly by summing over the entire output vocabulary. This slows both training and testing, especially when a large output vocabulary is used. To address this problem, noise contrastive estimation (NCE) is explored for RNNLM training. NCE removes the need for this normalization during both training and testing, making it insensitive to the output layer size. On a large vocabulary conversational telephone speech recognition task, a doubling of training speed on a GPU and a 56-fold speed-up in test-time evaluation on a CPU were obtained.
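To illustrate why NCE sidesteps the softmax sum, the minimal NumPy sketch below computes an NCE loss for a single word prediction using only the true word and k noise samples, so the per-word cost depends on k rather than the vocabulary size V. The weight matrix W, hidden state h, fixed log-normalizer LN_Z, and uniform noise distribution are illustrative assumptions for this sketch, not the configuration used in the paper.

import numpy as np

rng = np.random.default_rng(0)

V, H, k = 20000, 128, 10      # vocabulary size, hidden size, noise samples per word
LN_Z = 9.0                    # log-normalizer treated as a constant (assumption: no softmax sum needed)

W = rng.normal(0.0, 0.01, (V, H))   # output-layer weights (toy values)
h = rng.normal(0.0, 0.1, H)         # RNN hidden state for the current word history
p_n = np.full(V, 1.0 / V)           # noise distribution (uniform here for simplicity)

def nce_loss(target, noise_ids):
    """NCE loss for one prediction: touches only k+1 output rows, never all V."""
    ids = np.concatenate(([target], noise_ids))
    s = W[ids] @ h - LN_Z                    # unnormalized log-probabilities
    # P(word came from data) = sigmoid(s - log(k * p_n(w)))
    logit = s - np.log(k * p_n[ids])
    p_data = 1.0 / (1.0 + np.exp(-logit))
    # first entry is the true word; the remaining k entries are noise samples
    return -(np.log(p_data[0]) + np.sum(np.log(1.0 - p_data[1:])))

noise_ids = rng.integers(0, V, size=k)
print(nce_loss(target=42, noise_ids=noise_ids))

Because the normalization term is never summed over the vocabulary, the same unnormalized score W[w] @ h - LN_Z can be used directly at test time, consistent with the test-time evaluation speed-up reported in the abstract.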
Pages: 5411-5415
Page count: 5