Comparison of Item Analysis of Online and Offline Courses in College of Medicine, Najran University

Cited: 0
Author
Huneif, Mohammed Ayed [1]
Affiliation
[1] Najran Univ, Coll Med, Dept Pediat, Najran, Saudi Arabia
Source
PAKISTAN JOURNAL OF MEDICAL & HEALTH SCIENCES | 2020, Vol. 14, No. 04
Keywords
online; manual; exams; tests; item analysis; formative assessment; higher education; support
DOI
Not available
CLC number
R5 [Internal Medicine]
Subject classification codes
1002; 100201
Abstract
Today, information technology is everywhere and accessible to almost everyone, and medical education now faces novel challenges. On the one hand, the boom in online healthcare information constantly challenges medical students to update and extend their current body of knowledge quickly; on the other, the rapid expansion of online learning in higher education has benefited from potential cost savings for an unlimited number of learners. The undergraduates who took part in the included experiments were passively assigned to an experimental or a control group. Aim: The main aim of this study was to compare online and offline exams. Methods: Data were collected from the examination department of the College of Medicine, Najran University, and entered into SPSS ver. 20 for analysis. Descriptive and inferential statistics (means, scores, frequencies, and percentages) were obtained; t-tests and chi-square tests were used to identify significant differences and the degree of association between scores and item-analysis parameters. Results: Across the 6 tests, we observed no significant differences between online and offline pass rates except for Exam F (P<0.01); easy questions were significantly more frequent in online exams. Conclusion: Online MCQs seem to be easier than paper-based assessments, but authorized/validated online MCQs would be more precise.
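The paper reports running its analyses in SPSS ver. 20. As a rough illustration only, the sketch below reproduces the same kind of computation in Python with scipy: classical item-analysis parameters (difficulty index P, discrimination index D), a t-test on item difficulty between modes, and a chi-square test on pass rates. The synthetic response matrices, the 60% pass mark, and the 27% upper/lower split used for the D index are assumptions for illustration, not values taken from the paper.

```python
# Minimal sketch of the item-analysis and comparison steps described in the
# abstract, on synthetic data (the study's actual exam data are not public).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic 0/1 response matrices: rows = students, columns = MCQ items.
# The online exam is simulated as easier, mirroring the paper's finding.
online = rng.binomial(1, 0.75, size=(100, 40))
offline = rng.binomial(1, 0.65, size=(100, 40))

def item_analysis(responses):
    """Classical item-analysis parameters for a 0/1 response matrix."""
    difficulty = responses.mean(axis=0)            # P = proportion correct
    totals = responses.sum(axis=1)
    upper = responses[totals >= np.percentile(totals, 73)]  # top 27% (assumed split)
    lower = responses[totals <= np.percentile(totals, 27)]  # bottom 27%
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)  # D index
    return difficulty, discrimination

diff_on, disc_on = item_analysis(online)
diff_off, disc_off = item_analysis(offline)

# t-test on item difficulty between modes (are online items easier?)
t, p_t = stats.ttest_ind(diff_on, diff_off)

# Chi-square on pass rates; a 60% pass mark is assumed for illustration.
pass_on = (online.sum(axis=1) >= 0.6 * online.shape[1]).sum()
pass_off = (offline.sum(axis=1) >= 0.6 * offline.shape[1]).sum()
table = [[pass_on, online.shape[0] - pass_on],
         [pass_off, offline.shape[0] - pass_off]]
chi2, p_chi, _, _ = stats.chi2_contingency(table)

print(f"mean difficulty online={diff_on.mean():.2f} offline={diff_off.mean():.2f}")
print(f"t-test on item difficulty: t={t:.2f}, p={p_t:.4f}")
print(f"chi-square on pass rates: chi2={chi2:.2f}, p={p_chi:.4f}")
```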
Pages: 1493-1496
Number of pages: 4