SPOKEN LANGUAGE UNDERSTANDING USING LONG SHORT-TERM MEMORY NEURAL NETWORKS

Times cited: 0
Authors
Yao, Kaisheng [1]
Peng, Baolin [1]
Zhang, Yu [1]
Yu, Dong [1]
Zweig, Geoffrey [1]
Shi, Yangyang [1]
Affiliations
[1] Microsoft, Redmond, WA 98052 USA
Keywords
Recurrent neural networks; long short-term memory; language understanding; ALGORITHM;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural network based approaches have recently produced record-setting performances in natural language understanding tasks such as word labeling. In the word labeling task, a tagger is used to assign a label to each word in an input sequence. Specifically, simple recurrent neural networks (RNNs) and convolutional neural networks (CNNs) have been shown to significantly outperform the previous state of the art, conditional random fields (CRFs). This paper investigates using long short-term memory (LSTM) neural networks, which contain input, output and forget gates and are more advanced than simple RNNs, for the word labeling task. To explicitly model output-label dependence, we propose a regression model on top of the un-normalized LSTM scores. We also propose to apply deep LSTMs to the task. We investigate the relative importance of each gate in the LSTM by fixing the other gates to a constant and learning only the gate in question. Experiments on the ATIS dataset validate the effectiveness of the proposed models.
Pages
189 - 194 (6 pages)
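The abstract above describes an LSTM tagger that emits un-normalized label scores for each word and adds a term over previous output labels to capture output-label dependence. Below is a minimal sketch of such a tagger, assuming PyTorch; the class name, layer sizes, and the label-transition parameter are illustrative assumptions, not the paper's exact implementation.

# Minimal sketch of an LSTM word tagger for slot filling, assuming PyTorch.
# Names, layer sizes, and the label-transition term are illustrative
# assumptions, not the paper's exact implementation.
import torch
import torch.nn as nn

class LSTMTagger(nn.Module):
    def __init__(self, vocab_size, num_labels, emb_dim=100,
                 hidden_dim=300, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # num_layers > 1 corresponds to a deep (stacked) LSTM.
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.proj = nn.Linear(hidden_dim, num_labels)  # un-normalized label scores
        # Learned label-transition scores approximating output-label dependence.
        self.trans = nn.Parameter(torch.zeros(num_labels, num_labels))

    def forward(self, word_ids, prev_label_ids):
        hidden, _ = self.lstm(self.embed(word_ids))    # (batch, seq, hidden_dim)
        scores = self.proj(hidden)                     # (batch, seq, num_labels)
        # Add the score of moving from the previous label to each candidate label.
        return scores + self.trans[prev_label_ids]

# Usage sketch with hypothetical sizes (ATIS-like: ~10k words, 127 slot labels).
model = LSTMTagger(vocab_size=10000, num_labels=127)
words = torch.randint(0, 10000, (2, 5))                # 2 sentences, 5 words each
prev_labels = torch.randint(0, 127, (2, 5))            # previously available labels
logits = model(words, prev_labels)                     # shape (2, 5, 127)

One common choice in such taggers, though not necessarily the paper's, is to feed gold previous labels during training and the model's own previous predictions during greedy left-to-right decoding.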
Related Papers
50 items in total
  • [41] Lower Limb Kinematics Trajectory Prediction Using Long Short-Term Memory Neural Networks
    Zaroug, Abdelrahman
    Lei, Daniel T. H.
    Mudie, Kurt
    Begg, Rezaul
    FRONTIERS IN BIOENGINEERING AND BIOTECHNOLOGY, 2020, 8 (08):
  • [42] Automatic Cause Inference of Construction Accident Using Long Short-Term Memory Neural Networks
    Wu, Hengqin
    Shen, Geoffrey Qiping
    Zhou, Zhenzong
    Li, Wenpeng
    Li, Xin
    CARBON PEAK AND NEUTRALITY STRATEGIES OF THE CONSTRUCTION INDUSTRY (ICCREM 2022), 2022, : 269 - 275
  • [43] Missing Precipitation Data Estimation Using Long Short-Term Memory Deep Neural Networks
    Djerbouai, Salim
    JOURNAL OF ECOLOGICAL ENGINEERING, 2022, 23 (05): : 216 - 225
  • [44] Intrapartum Fetal-State Classification using Long Short-Term Memory Neural Networks
    Warrick, Philip A.
    Hamilton, Emily F.
    2017 COMPUTING IN CARDIOLOGY (CINC), 2017, 44
  • [45] GRAPHEME-TO-PHONEME CONVERSION USING LONG SHORT-TERM MEMORY RECURRENT NEURAL NETWORKS
    Rao, Kanishka
    Peng, Fuchun
    Sak, Hasim
    Beaufays, Francoise
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 4225 - 4229
  • [46] Combining fuzzy clustering and improved long short-term memory neural networks for short-term load forecasting
    Liu, Fu
    Dong, Tian
    Liu, Qiaoliang
    Liu, Yun
    Li, Shoutao
    ELECTRIC POWER SYSTEMS RESEARCH, 2024, 226
  • [47] On the Initialization of Long Short-Term Memory Networks
    Ghazi, Mostafa Mehdipour
    Nielsen, Mads
    Pai, Akshay
    Modat, Marc
    Cardoso, M. Jorge
    Ourselin, Sebastien
    Sorensen, Lauge
    NEURAL INFORMATION PROCESSING (ICONIP 2019), PT I, 2019, 11953 : 275 - 286
  • [48] Evolving Long Short-Term Memory Networks
    Neto, Vicente Coelho Lobo
    Passos, Leandro Aparecido
    Papa, Joao Paulo
    COMPUTATIONAL SCIENCE - ICCS 2020, PT II, 2020, 12138 : 337 - 350
  • [49] CONTEXTUAL SPOKEN LANGUAGE UNDERSTANDING USING RECURRENT NEURAL NETWORKS
    Shi, Yangyang
    Yao, Kaisheng
    Chen, Hu
    Pan, Yi-Cheng
    Hwang, Mei-Yuh
    Peng, Baolin
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 5271 - 5275
  • [50] Short-term memory circuit using hardware ring neural networks
    Sasano, Naoya
    Saeki, Katsutoshi
    Sekine, Yoshifumi
    ARTIFICIAL LIFE AND ROBOTICS, 2005, 9 (02) : 81 - 85