MLP emulation of N-gram models as a first step to connectionist language modeling

Cited by: 0
Authors:
Castro, MJ [1 ]
Prat, F [1 ]
Casacuberta, F [1 ]
Affiliation:
[1] Univ Politecn Valencia, Dept Sistemes Informat & Computacio, Valencia, Spain
Source:
NINTH INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN99), VOLS 1 AND 2 | 1999 / No. 470
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
In problems such as automatic speech recognition and machine translation, where the system response must be a sentence in a given language, language models are employed to improve system performance. These language models are usually N-gram models (for instance, bigram or trigram models), which are estimated from large text databases using the occurrence frequencies of the N-grams. In 1989, Nakamura and Shikano empirically showed how multilayer perceptrons can emulate the predictive capabilities of trigram models while offering additional generalization. Our paper discusses Nakamura and Shikano's work, provides new empirical evidence on the capability of multilayer perceptrons to emulate N-gram models, and proposes new directions for extending neural network-based language models. The experimental work presented here compares connectionist phonological bigram models with a conventional one using several measures, including recognition performance in a Spanish acoustic-phonetic decoding task.
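As context for the abstract: the conventional baseline it describes is estimated directly from occurrence frequencies. The Python sketch below is an illustrative assumption, not the authors' implementation; it estimates a smoothed phonological bigram model P(next | previous) from symbol-pair counts on a toy corpus (the corpus, function names, and add-k smoothing are hypothetical). The connectionist counterpart studied in the paper would instead train a multilayer perceptron to map an encoding of the previous symbol to a probability distribution over the next symbol, learned from the same data.

from collections import Counter

def train_bigram(sentences, smoothing=1.0):
    # Estimate P(next | prev) from occurrence frequencies with add-k smoothing.
    # Illustrative sketch only; the vocabulary includes sentence-boundary markers.
    vocab = sorted({sym for s in sentences for sym in s} | {"<s>", "</s>"})
    pair_counts = Counter()
    prev_counts = Counter()
    for s in sentences:
        symbols = ["<s>"] + list(s) + ["</s>"]
        for prev, nxt in zip(symbols, symbols[1:]):
            pair_counts[(prev, nxt)] += 1
            prev_counts[prev] += 1
    def prob(prev, nxt):
        return (pair_counts[(prev, nxt)] + smoothing) / (
            prev_counts[prev] + smoothing * len(vocab))
    return prob, vocab

# Toy usage: three short "phoneme strings" standing in for a text database.
corpus = [["b", "a"], ["b", "a", "r"], ["a", "r"]]
prob, vocab = train_bigram(corpus)
print(prob("b", "a"))  # smoothed relative frequency of "a" following "b"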
Pages: 910 - 915
Number of pages: 6
Related papers
50 items in total
  • [31] Language modeling by string pattern N-gram for Japanese speech recognition
    Ito, A
    Kohda, M
    ICSLP 96 - FOURTH INTERNATIONAL CONFERENCE ON SPOKEN LANGUAGE PROCESSING, PROCEEDINGS, VOLS 1-4, 1996, : 490 - 493
  • [32] Stepwise API usage assistance using n-gram language models
    Santos, Andre L.
    Prendi, Goncalo
    Sousa, Hugo
    Ribeiro, Ricardo
    JOURNAL OF SYSTEMS AND SOFTWARE, 2017, 131 : 461 - 474
  • [33] N-gram modeling based on recognized phonemes in automatic language identification
    Kwan, H
    Hirose, K
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 1998, E81D (11) : 1224 - 1231
  • [34] N-gram Language Models in JLASER Neural Network Speech Recognizer
    Konopik, Miloslav
    Habernal, Ivan
    Brychcin, Tomas
    2010 INTERNATIONAL CONFERENCE ON APPLIED ELECTRONICS, 2010, : 167 - 170
  • [35] Task adaptation using MAP estimation in N-gram language modeling
    Masataki, H
    Sagisaka, Y
    Hisaki, K
    Kawahara, T
    1997 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), VOLS I-V, 1997, : 783 - 786
  • [36] Investigation on LSTM Recurrent N-gram Language Models for Speech Recognition
    Tueske, Zoltan
    Schlueter, Ralf
    Ney, Hermann
    19TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2018), VOLS 1-6: SPEECH RESEARCH FOR EMERGING MARKETS IN MULTILINGUAL SOCIETIES, 2018, : 3358 - 3362
  • [37] PERFORMANCE ANALYSIS OF NEURAL NETWORKS IN COMBINATION WITH N-GRAM LANGUAGE MODELS
    Oparin, Ilya
    Sundermeyer, Martin
    Ney, Hermann
    Gauvain, Jean-Luc
    2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2012, : 5005 - 5008
  • [38] Profiling Household Appliance Electricity Usage with N-Gram Language Modeling
    Li, Daoyuan
    Bissyande, Tegawende F.
    Kubler, Sylvain
    Klein, Jacques
    Le Traon, Yves
    PROCEEDINGS 2016 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2016, : 604 - 609
  • [39] Factored bilingual n-gram language models for statistical machine translation
    Crego, Josep M.
    Yvon, Francois
    MACHINE TRANSLATION, 2010, 24 (02) : 159 - 175
  • [40] Combining naive Bayes and n-gram language models for text classification
    Peng, FC
    Schuurmans, D
    ADVANCES IN INFORMATION RETRIEVAL, 2003, 2633 : 335 - 350