Label-dependency coding in Simple Recurrent Networks for Spoken Language Understanding

Cited by: 6
Authors
Dinarelli, Marco [1 ]
Vukotic, Vedran [2 ,3 ]
Raymond, Christian [2 ,3 ]
Affiliations
[1] USPC Univ Sorbonne Paris Cite, PSL Res Univ, Univ Sorbonne Nouvelle Paris 3, Lattice, CNRS, ENS Paris, Paris, France
[2] INSA Rennes, Rennes, France
[3] IRISA, INRIA, Rennes, France
Source
18TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2017), VOLS 1-6: SITUATED INTERACTION | 2017
Keywords
recurrent neural networks; label dependencies; spoken language understanding; slot filling; ATIS; MEDIA;
DOI
10.21437/Interspeech.2017-1480
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Modeling target label dependencies is important for sequence labeling tasks. This may become crucial for Spoken Language Understanding (SLU) applications, especially for the slot-filling task, where models often have to deal with a large number of target labels. Conditional Random Fields (CRF) were previously considered the most effective algorithm under these conditions. More recently, different architectures of Recurrent Neural Networks (RNNs) have been proposed for the SLU slot-filling task. Most of them, however, have been evaluated only on the simple ATIS database, from which it is difficult to draw significant conclusions. In this paper we propose new variants of RNNs able to learn label dependencies efficiently and effectively by integrating label embeddings. We first show that modeling label dependencies is unnecessary on the (simple) ATIS database, where even unstructured models can produce state-of-the-art results; on ATIS our new variants match state-of-the-art models while being much simpler. On the MEDIA benchmark, by contrast, we show that the modification introduced in the proposed RNN outperforms traditional RNNs and CRF models.
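The core idea the abstract describes — integrating label embeddings so the network can condition each prediction on previously emitted labels — can be sketched as a simple recurrent tagger that feeds the embedding of the previous predicted label back in at the next step. The sketch below is an illustrative toy with randomly initialized weights and hypothetical dimensions, not the authors' exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, for illustration only).
vocab_size, n_labels = 50, 8
word_dim, label_dim, hidden_dim = 16, 8, 32

# Embedding tables for words AND labels, plus recurrent weights.
E_w = rng.normal(0, 0.1, (vocab_size, word_dim))   # word embeddings
E_l = rng.normal(0, 0.1, (n_labels, label_dim))    # label embeddings
W_x = rng.normal(0, 0.1, (word_dim + label_dim, hidden_dim))
W_h = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
W_o = rng.normal(0, 0.1, (hidden_dim, n_labels))

def softmax(z):
    z = z - z.max()                    # numerical stability
    e = np.exp(z)
    return e / e.sum()

def tag_sequence(word_ids, start_label=0):
    """Greedy decoding: at each step the embedding of the previously
    predicted label is concatenated with the word embedding, so the
    model can learn label dependencies."""
    h = np.zeros(hidden_dim)
    prev_label = start_label
    labels = []
    for w in word_ids:
        x = np.concatenate([E_w[w], E_l[prev_label]])
        h = np.tanh(x @ W_x + h @ W_h)
        prev_label = int(np.argmax(softmax(h @ W_o)))
        labels.append(prev_label)
    return labels

preds = tag_sequence([3, 17, 42, 5])
print(preds)  # one label id per input token
```

In an unstructured tagger, each output would depend only on the word inputs; here the `E_l[prev_label]` term makes each decision explicitly conditioned on the label history, which is what matters on label-rich benchmarks such as MEDIA.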
Pages: 2491-2495
Page count: 5