The effect of instructions on multiple-choice test scores

Cited by: 12
Authors
Prieto, G [1 ]
Delgado, AR [1 ]
Affiliation
[1] Univ Salamanca, E-37008 Salamanca, Spain
Keywords
multiple-choice testing method; guessing; scoring testing; reliability
DOI
10.1027//1015-5759.15.2.143
Chinese Library Classification
B849 [Applied Psychology]
Discipline Code
040203
Abstract
Most standardized tests instruct subjects to guess under scoring procedures that do not correct for guessing or correct only for expected random guessing. Other scoring rules, such as offering a small reward for omissions or punishing errors by discounting more than expected from random guessing, have been proposed. This study was designed to test the effects of these four instruction/scoring conditions on performance indicators and on score reliability of multiple-choice tests. Some 240 participants were randomly assigned to four conditions differing in how much they discourage guessing. Subjects performed two psychometric computerized tests, which differed only in the instructions provided and the associated scoring procedure. For both tests, our hypotheses predicted (0) an increasing trend in omissions (showing that instructions were effective); (1) decreasing trends in wrong and right responses; (2) an increase in reliability estimates of both number right and scores. Predictions regarding performance indicators were mostly fulfilled, but expected differences in reliability failed to appear. The discussion of results takes into account not only psychometric issues related to guessing, but also the misleading educational implications of recommendations to guess in testing contexts.
Pages: 143-150
Page count: 8
Related Papers
50 records in total
  • [31] Partial Credit in Answer-Until-Correct Multiple-Choice Tests Deployed in a Classroom Setting
    Slepkov, Aaron D.
    Godfrey, Alan T. K.
    APPLIED MEASUREMENT IN EDUCATION, 2019, 32 (02) : 138 - 150
  • [32] Development and use of a multiple-choice item writing flaws evaluation instrument in the context of general chemistry
    Breakall, Jared
    Randles, Christopher
    Tasker, Roy
    CHEMISTRY EDUCATION RESEARCH AND PRACTICE, 2019, 20 (02) : 369 - 382
  • [33] Comparison of Electronic Examinations using Adaptive Multiple-choice Questions and Constructed-response Questions
    Stavroulakis, Peter J.
    Photopoulos, Panagiotis
    Ventouras, Errikos
    Triantis, Dimos
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED EDUCATION (CSEDU), VOL 1, 2020, : 358 - 365
  • [34] Pushing Critical Thinking Skills With Multiple-Choice Questions: Does Bloom's Taxonomy Work?
    Zaidi, Nikki L. Bibler
    Grob, Karri L.
    Monrad, Seetha M.
    Kurtz, Joshua B.
    Tai, Andrew
    Ahmed, Asra Z.
    Gruppen, Larry D.
    Santen, Sally A.
    ACADEMIC MEDICINE, 2018, 93 (06) : 856 - 859
  • [35] Online Cognitive Diagnostic Assessment with Ordered Multiple-Choice Items for Word Problems involving 'Time'
    Chin, Huan
    Chew, Cheng Meng
    EDUCATION AND INFORMATION TECHNOLOGIES, 2022, 27 (06) : 7721 - 7748
  • [36] Development and validation of online cognitive diagnostic assessment with ordered multiple-choice items for 'Multiplication of Time'
    Chin, Huan
    Chew, Cheng Meng
    Lim, Hooi Lian
    JOURNAL OF COMPUTERS IN EDUCATION, 2021, 8 (02) : 289 - 316
  • [37] Comparing Delay Discounting Rates When Using the Fill-in-the-Blank and Multiple-Choice Methods
    Weatherly, Jeffrey N.
    Derenne, Adam
    JOURNAL OF GENERAL PSYCHOLOGY, 2011, 138 (04) : 300 - 318
  • [38] Option weights should be determined empirically and not by experts when assessing knowledge with multiple-choice items
    Diedenhofen, Birk
    Musch, Jochen
    INTERNATIONAL JOURNAL OF SELECTION AND ASSESSMENT, 2019, 27 (03) : 256 - 266
  • [39] Item analysis: the impact of distractor efficiency on the difficulty index and discrimination power of multiple-choice items
    Rezigalla, Assad Ali
    Eleragi, Ali Mohammed Elhassan Seid Ahmed
    Elhussein, Amar Babikir
    Alfaifi, Jaber
    ALGhamdi, Mushabab A.
    Al Ameer, Ahmed Y.
    Yahia, Amar Ibrahim Omer
    Mohammed, Osama A.
    Adam, Masoud Ishag Elkhalifa
    BMC MEDICAL EDUCATION, 2024, 24 (01)
  • [40] Differences in Reaction to Immediate Feedback and Opportunity to Revise Answers for Multiple-Choice and Open-Ended Questions
    Attali, Yigal
    Laitusis, Cara
    Stone, Elizabeth
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2016, 76 (05) : 787 - 802