Enhanced Harris Hawks optimization as a feature selection for the prediction of student performance

Cited by: 0
Authors
Hamza Turabieh
Sana Al Azwari
Mahmoud Rokaya
Wael Alosaimi
Abdullah Alharbi
Wajdi Alhakami
Mrim Alnfiai
Affiliation
[1] Taif University,Department of Information Technology, College of Computers and Information Technology
Source
Computing | 2021 / Volume 103
Keywords
Harris Hawks optimization; Student performance; Population diversity; Educational data mining; 68T20; 68T05;
DOI
Not available
Abstract
Predicting student performance in educational organizations such as universities, community colleges, schools, and training centers can enhance the overall results of these organizations. Big data can be extracted from their internal systems, such as exam records, statistics about virtual courses, and e-learning platforms, but finding meaningful knowledge in the extracted data is a challenging task. In this paper, we propose a modified version of the Harris Hawks Optimization (HHO) algorithm that controls population diversity to overcome early convergence and prevent trapping in a local optimum. The proposed approach is employed as a feature selection algorithm to discover the most valuable features for the student performance prediction problem. A dynamic controller manages population diversity by observing the performance of HHO, using the k-nearest neighbors (kNN) algorithm as a clustering approach. Once all solutions belong to one cluster, an injection process redistributes the solutions over the search space. A set of machine learning classifiers, including kNN, a layered recurrent neural network (LRNN), Naïve Bayes, and an artificial neural network, is used to evaluate the overall prediction system. A real dataset obtained from the UCI Machine Learning Repository is adopted in this paper. The obtained results show the importance of predicting students’ performance at an early stage to avoid student failure and improve the overall performance of the educational organization.
Moreover, the reported results show that the combination of the enhanced HHO and the LRNN outperforms the other classifiers with an accuracy of 92%, since the LRNN is a deep learning algorithm that is able to learn from both previous and current input values.
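The diversity-controlled injection step described in the abstract can be illustrated with a minimal sketch. This is not the authors’ implementation: the diversity measure (mean distance to the population centroid), the collapse threshold, and the fraction of solutions retained are all assumptions chosen for illustration; the paper itself detects collapse via kNN-based clustering.

```python
import numpy as np

def diversity_injection(population, lb, ub, threshold=1e-3,
                        keep_frac=0.2, rng=None):
    """Redistribute solutions over the search space when the population
    has collapsed into a single region (a stand-in for 'one cluster').

    population : (n, d) array of candidate solutions
    lb, ub     : per-dimension lower and upper bounds of the search space
    Returns the (possibly updated) population and an injection flag.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = population.shape
    # Proxy diversity measure: mean distance of solutions to the centroid.
    centroid = population.mean(axis=0)
    spread = np.linalg.norm(population - centroid, axis=1).mean()
    if spread < threshold:  # all solutions are effectively one cluster
        n_keep = max(1, int(keep_frac * n))
        injected = population.copy()
        # Keep a few solutions, re-sample the rest uniformly in the bounds.
        injected[n_keep:] = rng.uniform(lb, ub, size=(n - n_keep, d))
        return injected, True
    return population, False
```

In a full HHO loop this check would run once per generation, so the injected random solutions re-seed exploration while the retained solutions preserve the best regions found so far.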
Pages: 1417–1438 (21 pages)