Assessment of multiple choice question exams quality using graphical methods

Cited by: 0
Authors
Yousuf, Mustafa S. [1 ]
Miles, Katherine [1 ]
Harvey, Heather [2 ]
Al-Tamimi, Mohammad [1 ]
Badran, Darwish [3 ]
Affiliations
[1] Hashemite Univ, Zarqa, Jordan
[2] Limestone Univ, Gaffney, SC USA
[3] Univ Jordan, Amman, Jordan
Keywords
control charts; exam analysis; medical exams; receiver operator characteristic curve; signal detectability; education; difficulty; students; tables; tests; area; text
DOI
Not available
Chinese Library Classification (CLC)
G40 [Education];
Discipline codes
040101; 120403;
Abstract
Exams should be valid, reliable, and discriminative. Although many informative methods exist for exam analysis, results presented only as numbers may not be easily comprehended; graphical analysis tools can make the results easier to interpret. Two such methods were employed: standardized x-bar control charts with the standard error of measurement as control limits, and receiver operator characteristic curves. Exams from two medical classes were analyzed. For each exam, the mean, standard deviation, reliability, and standard error of measurement were calculated. The means were standardized and plotted against the reference lines of the control chart, and they were also chosen as cut-off points for calculating sensitivity and specificity. The receiver operator characteristic curve was plotted and the area under the curve was determined. Standardized control charts allowed clear, simultaneous comparison of multiple exams, and deriving the control limits from the standard error of measurement created acceptable limits of variability that incorporated both the standard deviation and the reliability. The receiver operator characteristic curve graphically showed the discriminative power of an exam. Observations made with the graphical and classical methods were consistent. Using graphical methods to analyze exams could make their interpretation more accessible and make it easier to identify exams that require further investigation.
Practitioner Notes
1. Exams should be valid, reliable, and discriminative.
2. Classical methods for analyzing exam quality represent the data numerically.
3. Numerical representations of the data may not be readily understood by department staff.
4. Graphical methods for analyzing exams represent the data in easy-to-understand charts.
5. Control charts and receiver operator characteristic curves can be employed for this purpose.
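The abstract describes two sets of calculations: standardized exam means compared against control limits derived from the standard error of measurement (SEM = SD * sqrt(1 - reliability)), and sensitivity, specificity, and area under the receiver operator characteristic (ROC) curve with the exam mean as the cut-off. The Python sketch below illustrates these calculations on simulated item responses; the simulated data, the KR-20 reliability estimate, the placement of the limits at plus or minus one SEM, and the pass/fail reference used for the ROC step are all assumptions for illustration, not the authors' data or code.

```python
# Illustrative sketch (not the authors' code or data) of the two graphical
# analyses described in the abstract: a standardized x-bar control chart whose
# limits come from the standard error of measurement (SEM), and an ROC summary
# using the exam mean as the cut-off.

import numpy as np

rng = np.random.default_rng(0)

def exam_stats(items: np.ndarray) -> dict:
    """Classical statistics for one exam: mean, SD, KR-20 reliability, SEM."""
    scores = items.sum(axis=1)
    n_items = items.shape[1]
    p = items.mean(axis=0)                                   # item difficulty
    kr20 = (n_items / (n_items - 1)) * (
        1 - (p * (1 - p)).sum() / scores.var(ddof=1))        # reliability
    sd = scores.std(ddof=1)
    return {"scores": scores, "mean": scores.mean(), "sd": sd,
            "kr20": kr20, "sem": sd * np.sqrt(1 - kr20)}

# Simulate a few 50-item exams taken by 120 students (assumed data).
exams = {name: (rng.random((120, 50)) < diff).astype(int)
         for name, diff in [("Exam A", 0.75), ("Exam B", 0.65), ("Exam C", 0.55)]}

# (1) Standardized control-chart values: each exam's limits are placed at
# +/- SEM expressed in SD units, so SD and reliability both feed into the
# acceptable band (one plausible reading of the abstract, not a verified one).
for name, items in exams.items():
    s = exam_stats(items)
    limit = s["sem"] / s["sd"]
    print(f"{name}: mean={s['mean']:.1f}  KR-20={s['kr20']:.2f}  "
          f"SEM={s['sem']:.2f}  standardized limits=[-{limit:.2f}, +{limit:.2f}]")

# (2) ROC summary for one exam with the mean as cut-off, against a
# hypothetical pass/fail reference standard (simulated here).
s = exam_stats(exams["Exam A"])
scores = s["scores"]
truth = (scores + rng.normal(0, 3, scores.size)) > s["mean"]   # reference (assumed)
pred = scores >= s["mean"]                                     # mean as cut-off
sensitivity = (pred & truth).sum() / truth.sum()
specificity = (~pred & ~truth).sum() / (~truth).sum()
# AUC via its rank interpretation: P(random "pass" outscores random "fail").
pos, neg = scores[truth], scores[~truth]
auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  AUC={auc:.2f}")
```

Running the sketch prints per-exam statistics and the ROC summary; in the study these quantities would be plotted as a control chart and an ROC curve rather than printed to the console.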
Pages: 19