Research of Sentiment Analysis based on Long-Sequence-Term-Memory Model

Cited by: 1
Authors
Yin, Fulian [1 ]
Pan, Xingyi [1 ]
He, Xiating [1 ]
Xu, Rongge [1 ]
Affiliations
[1] Commun Univ China, Coll Informat Engn, Beijing, Peoples R China
Source
2018 3RD INTERNATIONAL CONFERENCE ON MECHANICAL, CONTROL AND COMPUTER ENGINEERING (ICMCCE) | 2018
Keywords
sentiment analysis; embedding word; Long-Sequence-Term-Memory model; natural language processing; machine learning;
DOI
10.1109/ICMCCE.2018.00109
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
This paper proposes a method based on the Long-Sequence-Term-Memory (LSTM) model and word-vector embeddings for analyzing the sentiment of online reviews. To obtain a sentence-level vector representation, the method uses the non-linear feature extraction of the LSTM to extend word embeddings into a sentence representation and ultimately produce a sentence embedding. The experimental results demonstrate the high accuracy of the method, which reaches 91.35% when classifying the sentiment of online reviews. Two of its advantages are its applicability to a variety of languages and its strong scalability to large-scale corpora; in addition, bigram feature extraction further improves test accuracy. Overall, the experiments show that sentiment analysis based on the LSTM model and word embeddings is highly effective, scales well, and can be applied to reviews in different languages.
Pages: 488-493
Number of pages: 6
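
The record does not include the authors' implementation; the sketch below is only a minimal illustration, in Keras, of the kind of pipeline the abstract describes: token ids are mapped to word embeddings, an LSTM reads the embedded sequence, its final hidden state serves as the sentence representation, and a sigmoid output classifies a review as positive or negative. The library choice and all hyperparameters (vocab_size, embed_dim, max_len, the 128-unit LSTM) are illustrative assumptions, not details taken from the paper.

    # Minimal illustrative sketch (not the authors' code): word embeddings + LSTM
    # for binary sentiment classification of tokenized online reviews.
    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

    vocab_size = 20000  # assumed vocabulary size
    embed_dim = 128     # assumed word-embedding dimension
    max_len = 100       # assumed (padded) review length in tokens

    model = Sequential([
        Input(shape=(max_len,)),
        # Map token ids to dense word vectors (the word-embedding step).
        Embedding(input_dim=vocab_size, output_dim=embed_dim),
        # The LSTM reads the word-vector sequence; its final hidden state
        # acts as a fixed-length sentence embedding.
        LSTM(128),
        # Probability that the review expresses positive sentiment.
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Placeholder data; real inputs would be integer-encoded review tokens,
    # optionally augmented with bigram features as the abstract suggests.
    x_dummy = np.random.randint(0, vocab_size, size=(32, max_len))
    y_dummy = np.random.randint(0, 2, size=(32, 1))
    model.fit(x_dummy, y_dummy, epochs=1, batch_size=8, verbose=0)

Taking the last hidden state of the LSTM as the sentence embedding is one common way to realize the sentence-level representation the abstract mentions; pooling over all hidden states would be an equally plausible reading.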