Sentence subjectivity analysis of a political and ideological debate dataset using LSTM and BiLSTM with attention and GRU models

Cited by: 23
Authors
Al Hamoud, Ahmed [1,3]
Hoenig, Amber [1]
Roy, Kaushik [2,3]
Affiliations
[1] North Carolina A&T State Univ, Dept Computat Data Sci & Engn, Greensboro, NC 27411 USA
[2] North Carolina A&T State Univ, Dept Comp Sci, Greensboro, NC 27411 USA
[3] North Carolina Agr & Tech State Univ, 1601 Market St, Greensboro, NC 27411 USA
Keywords
Subjectivity analysis; Deep learning; Long Short-Term Memory (LSTM); Gated Recurrent Unit (GRU); LSTM with attention; Attention network; CLASSIFICATION; TEXT
DOI
10.1016/j.jksuci.2022.07.014
CLC Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Subjectivity analysis is one of the key tasks in the field of natural language processing. Used to annotate data as subjective or objective, subjectivity analysis can be implemented on its own or as a precursor to other NLP applications such as sentiment analysis, emotion analysis, consumer review analysis, political opinion analysis, document summarization, and question answering systems. The main objective of this article is to test and compare six deep learning methods for subjectivity classification: Long Short-Term Memory networks (LSTM), Gated Recurrent Units (GRU), bidirectional GRU, bidirectional LSTM, LSTM with attention, and bidirectional LSTM with attention. We introduced a combined method for subjectivity annotation that uses lexicon-based and syntactic pattern-based techniques, and we evaluated the performance of GloVe embeddings versus one-hot encoding. We also reformatted, preprocessed, and annotated a political and ideological debate dataset for use in subjectivity analysis. Our results compare favorably with existing research on subjectivity analysis, achieving very high accuracy and strong evaluation metrics. LSTM with attention performed the best of all the methods we tested, with an accuracy of 97.39%. (c) 2022 The Authors. Published by Elsevier B.V. on behalf of King Saud University. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
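As a rough illustration of the attention-based architecture compared in the abstract, the sketch below builds a bidirectional LSTM with a simple additive attention layer for binary subjective/objective classification in tf.keras. This is not the authors' implementation; the vocabulary size, sequence length, embedding dimension, and hidden sizes are illustrative assumptions, and in practice the embedding layer would be initialized with pretrained GloVe vectors (or compared against one-hot inputs).

```python
# Minimal sketch (not the authors' implementation) of a BiLSTM-with-attention
# subjectivity classifier in tf.keras. All sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20000  # assumed vocabulary size
MAX_LEN = 60        # assumed maximum sentence length in tokens
EMBED_DIM = 100     # assumed embedding dimension (e.g. 100-d GloVe)

tokens = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(tokens)

# Bidirectional LSTM returning per-timestep outputs so attention can weight them.
h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # (batch, MAX_LEN, 128)

# Simple additive self-attention: score each timestep, normalize with softmax,
# and form the sentence vector as the weighted sum of the LSTM outputs.
scores = layers.Dense(1, activation="tanh")(h)             # (batch, MAX_LEN, 1)
weights = layers.Softmax(axis=1)(scores)                   # attention weights
sentence = layers.Lambda(
    lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])  # (batch, 128)

pred = layers.Dense(1, activation="sigmoid")(sentence)     # P(subjective)

model = tf.keras.Model(tokens, pred)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Dropping the Bidirectional wrapper gives the plain LSTM-with-attention variant, and swapping layers.LSTM for layers.GRU gives the GRU counterparts, which is how the six configurations described above relate to one another.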
Pages: 7974-7987
Page count: 14