Exploring Optimality and Consistency of Supervised Machine Learning Algorithms in Sentiment Analysis

Cited by: 1
Authors
Ho, Chuk Fong [1]
Liew, Jessie [1]
Lim, Tong Ming [1]
Affiliation
[1] Tunku Abdul Rahman Univ Management & Technol, Kuala Lumpur, Malaysia
Source
PROCEEDINGS OF THE 2024 9TH INTERNATIONAL CONFERENCE ON INTELLIGENT INFORMATION TECHNOLOGY, ICIIT 2024 | 2024
Keywords
Sentiment analysis; Supervised Machine Learning; Artificial Intelligence
DOI
10.1145/3654522.3654531
Chinese Library Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Over the past decade, deep learning has gained massive popularity in the field of machine learning (ML). However, training a deep learning model is computationally expensive and requires a huge amount of labeled data, which is often limited in availability. Since supervised ML algorithms can be trained on smaller datasets, making them more feasible in practice, this study investigates the performance of six widely used supervised ML algorithms (Decision Trees, K-Nearest Neighbors, Logistic Regression, Naive Bayes, Random Forests, and Support Vector Machines) in terms of optimality and consistency in the context of sentiment analysis. The findings underscore the importance of adopting a high-performing, consistent, and predictable supervised ML algorithm to overcome language barriers for reliable sentiment analysis, and show that both Logistic Regression and Support Vector Machines fulfil these criteria.
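The comparison the abstract describes can be illustrated with a minimal sketch: the six named classifiers trained on a tiny toy sentiment dataset. The paper's actual corpus, features, and evaluation protocol are not given here, so the example texts, the TF-IDF representation, and the specific scikit-learn estimators are all assumptions for illustration only.

```python
# Minimal sketch (assumed setup): six supervised classifiers on a
# toy binary-sentiment dataset, featurized with TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC

# Hypothetical data; the study's real dataset is not reproduced here.
texts = [
    "great product, loved it",
    "terrible, waste of money",
    "excellent quality and fast shipping",
    "awful experience, never again",
    "really happy with this purchase",
    "very disappointed and angry",
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

X = TfidfVectorizer().fit_transform(texts)

models = {
    "Decision Trees": DecisionTreeClassifier(random_state=0),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=3),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": MultinomialNB(),
    "Random Forests": RandomForestClassifier(random_state=0),
    "Support Vector Machines": LinearSVC(),
}

scores = {}
for name, model in models.items():
    model.fit(X, labels)
    # Training accuracy only, for illustration; a real comparison
    # would use held-out data or cross-validation.
    scores[name] = model.score(X, labels)
    print(f"{name}: {scores[name]:.2f}")
```

A faithful reproduction of the paper's optimality/consistency analysis would additionally repeat such runs across datasets and random seeds and compare the variance of the resulting scores.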
Pages: 48-54
Page count: 7