CONSTRUCTED RESPONSE OR MULTIPLE-CHOICE QUESTIONS FOR ASSESSING DECLARATIVE PROGRAMMING KNOWLEDGE? THAT IS THE QUESTION!

Cited by: 1
Authors
Belo, Yolanda [1 ]
Moro, Sergio [2 ]
Martins, Antonio [2 ]
Ramos, Pedro [2 ,3 ]
Costa, Joana Martinho [2 ]
Esmerado, Joaquim [2 ]
Affiliations
[1] Inst Univ Lisboa ISCTE IUL, Lisbon, Portugal
[2] Inst Univ Lisboa ISCTE IUL, ISTAR IUL, Lisbon, Portugal
[3] Inst Univ Lisboa ISCTE IUL, IT IUL, Lisbon, Portugal
Keywords
constructed response; multiple-choice questions; educational data mining; support vector machine; neural networks; BOUNDARIES; STUDENTS;
D O I
10.28945/4479
CLC Classification
G40 [Education];
Subject Classification
040101 ; 120403 ;
Abstract
Aim/Purpose: This paper presents a data mining approach for analyzing responses to advanced declarative programming questions. The goal of this research is to find a model that can explain the results students obtain on exams with Constructed Response (CR) questions and with equivalent Multiple-Choice Questions (MCQ).
Background: The assessment of acquired knowledge plays a fundamental role in the teaching-learning process. It helps to identify the factors that can guide the teacher in developing pedagogical methods and evaluation tools, and it also contributes to the self-regulation of learning. However, the better question format for assessing declarative programming knowledge is still a subject of ongoing debate. While some research advocates the use of constructed responses, other work emphasizes the potential of multiple-choice questions.
Methodology: A sensitivity analysis was applied to extract useful knowledge from the relevance of the characteristics (i.e., the input variables) used in the data mining process to compute the score.
Contribution: Such knowledge helps teachers decide which format to adopt with respect to their objectives and the expected student results.
Findings: The results show a set of factors that influence the discrepancy between answers in the two formats.
Recommendations for Practitioners: Teachers can make an informed decision about whether to choose multiple-choice questions or constructed responses, taking the results of this study into account.
Recommendations for Researchers: In this study, a block of exams with CR questions is verified to complement the learning area, yielding greater performance in the evaluation of students and improving the teaching-learning process.
Impact on Society: The results of this research confirm the findings of several other researchers that the use of ICT and the application of MCQ is an added value in the evaluation process. In most cases the student is more likely to succeed with MCQ; however, if the teacher prefers to evaluate with CR, other research approaches are needed.
Future Research: Future research should include other question formats.
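The sensitivity analysis named in the Methodology section can be illustrated with a minimal sketch: vary one input variable at a time across its range while holding the others at a baseline, and rank variables by how much the model's predicted score changes. The model, variable names, and ranges below are hypothetical placeholders, not the paper's actual data or fitted SVM/neural network.

```python
# Illustrative one-at-a-time sensitivity analysis (a common technique in
# educational data mining); NOT the authors' code or data.

def sensitivity(model, baseline, ranges, steps=5):
    """For each input variable, sweep it across its range while holding
    the other variables at the baseline, and record the spread of the
    model's predictions. A larger spread = a more relevant variable."""
    relevance = {}
    for name, (lo, hi) in ranges.items():
        preds = []
        for k in range(steps):
            x = dict(baseline)
            x[name] = lo + (hi - lo) * k / (steps - 1)
            preds.append(model(x))
        relevance[name] = max(preds) - min(preds)
    return relevance

# Hypothetical fitted score model: exam score from study hours and a
# prior grade (stand-ins for the study's real input variables).
model = lambda x: 0.6 * x["hours"] + 0.3 * x["prior"] + 1.0
baseline = {"hours": 10.0, "prior": 12.0}
ranges = {"hours": (0.0, 20.0), "prior": (0.0, 20.0)}

print(sensitivity(model, baseline, ranges))
```

With these toy coefficients, "hours" shows twice the spread of "prior", so it would be ranked as the more relevant characteristic for the score.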
Pages: 153-170 (18 pages)