SPOKEN LANGUAGE UNDERSTANDING USING LONG SHORT-TERM MEMORY NEURAL NETWORKS

Cited: 0
Authors
Yao, Kaisheng [1 ]
Peng, Baolin [1 ]
Zhang, Yu [1 ]
Yu, Dong [1 ]
Zweig, Geoffrey [1 ]
Shi, Yangyang [1 ]
Affiliation
[1] Microsoft, Redmond, WA 98052 USA
Source
2014 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY (SLT 2014) | 2014
Keywords
Recurrent neural networks; long short-term memory; language understanding
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Neural network based approaches have recently produced record-setting performance on natural language understanding tasks such as word labeling. In the word labeling task, a tagger assigns a label to each word in an input sequence. In particular, simple recurrent neural networks (RNNs) and convolutional neural networks (CNNs) have been shown to significantly outperform the previous state of the art, conditional random fields (CRFs). This paper investigates long short-term memory (LSTM) neural networks, which contain input, output, and forget gates and are more advanced than simple RNNs, for the word labeling task. To explicitly model output-label dependence, we propose a regression model on top of the un-normalized LSTM scores. We also propose applying deep LSTMs to the task. We investigate the relative importance of each gate in the LSTM by fixing the other gates to a constant and learning only the gate of interest. Experiments on the ATIS dataset validate the effectiveness of the proposed models.
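The input, output, and forget gating that the abstract refers to can be sketched as a single forward step of a generic LSTM cell. The sketch below is a minimal NumPy illustration of the standard LSTM equations, not the authors' implementation; the weight names (`W`, `U`, `b`) and the gate ordering inside the stacked matrices are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    x:       input vector, shape (D,)
    h_prev:  previous hidden state, shape (H,)
    c_prev:  previous cell state, shape (H,)
    W, U, b: stacked parameters, shapes (4H, D), (4H, H), (4H,);
             rows ordered [input gate; forget gate; output gate; candidate].
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate: how much new content to write
    f = sigmoid(z[H:2*H])        # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])      # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])      # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c
```

The gate-ablation experiments described in the abstract could be mimicked in this sketch by replacing one gate's activation (e.g. the forget gate `f`) with a constant vector while the remaining gates stay learnable.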
Pages: 189 - 194 (6 pages)
Related Papers
(50 results)
  • [1] Recognition of Sign Language System for Indonesian Language Using Long Short-Term Memory Neural Networks
    Rakun, Erdefi
    Arymurthy, Aniati M.
    Stefanus, Lim Y.
    Wicaksono, Alfan F.
    Wisesa, I. Wayan W.
    ADVANCED SCIENCE LETTERS, 2018, 24 (02) : 999 - 1004
  • [2] Dialog State Tracking Using Long Short-term Memory Neural Networks
    Yang, Xiaohao
    Liu, Jia
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 1800 - 1804
  • [3] LATE REVERBERATION SUPPRESSION USING RECURRENT NEURAL NETWORKS WITH LONG SHORT-TERM MEMORY
    Zhao, Yan
    Wang, DeLiang
    Xu, Buye
    Zhang, Tao
    2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 5434 - 5438
  • [4] Industrial Financial Forecasting using Long Short-Term Memory Recurrent Neural Networks
    Ali, Muhammad Mohsin
    Babar, Muhammad Imran
    Hamza, Muhammad
    Jehanzeb, Muhammad
    Habib, Saad
    Khan, Muhammad Sajid
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2019, 10 (04) : 88 - 99
  • [5] An Incremental Learning Approach Using Long Short-Term Memory Neural Networks
    Lemos Neto, Alvaro C.
    Coelho, Rodrigo A.
    de Castro, Cristiano L.
    JOURNAL OF CONTROL AUTOMATION AND ELECTRICAL SYSTEMS, 2022, 33 (05) : 1457 - 1465
  • [7] Using Ant Colony Optimization to Optimize Long Short-Term Memory Recurrent Neural Networks
    ElSaid, AbdElRahman
    El Jamiy, Fatima
    Higgins, James
    Wild, Brandon
    Desell, Travis
    GECCO'18: PROCEEDINGS OF THE 2018 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2018, : 13 - 20
  • [8] Empirical modeling of ethanol production dynamics using long short-term memory recurrent neural networks
    Sousa, F. M. M.
    Fonseca, R. R.
    da Silva, F. V.
    Bioresource Technology Reports, 2021, 15
  • [9] Modelling energy demand response using long short-term memory neural networks
    Mesa Jiménez, José Joaquín
    Stokes, Lee
    Moss, Chris
    Yang, Qingping
    Livina, Valerie N.
    Energy Efficiency, 2020, 13 : 1263 - 1280
  • [10] Early Forecasting of Rice Blast Disease Using Long Short-Term Memory Recurrent Neural Networks
    Kim, Yangseon
    Roh, Jae-Hwan
    Kim, Ha Young
    SUSTAINABILITY, 2018, 10 (01)