Metacognitive Overload!: Positive and Negative Effects of Metacognitive Prompts in an Intelligent Tutoring System

Cited by: 21
Authors
McCarthy K.S. [1 ]
Likens A.D. [1 ]
Johnson A.M. [1 ]
Guerrero T.A. [1 ]
McNamara D.S. [1 ]
Affiliations
[1] Arizona State University, Tempe, AZ
Keywords
Intelligent tutoring systems; Log data; Metacognition; Reading comprehension
DOI
10.1007/s40593-018-0164-5
Abstract
Research suggests that promoting metacognitive awareness can increase performance in, and learning from, intelligent tutoring systems (ITSs). The current work examines the effects of two metacognitive prompts within iSTART, a reading comprehension strategy ITS in which students practice writing quality self-explanations. In addition to comparing iSTART practice to a no-training control, those in the iSTART condition (n = 116) were randomly assigned to a 2 (performance threshold: off, on) × 2 (self-assessment: off, on) design. The performance threshold notified students when their average self-explanation score fell below an experimenter-set threshold, and the self-assessment prompted students to estimate their self-explanation score on the current trial. Students who practiced with iSTART had higher posttest self-explanation scores and inference comprehension scores on a transfer test than students in the no-training control, replicating previous benefits of iSTART. However, neither metacognitive prompt affected these learning outcomes. In-system self-explanation scores indicated that the metacognitive prompts were detrimental to performance relative to standard iSTART practice. This study did not find benefits of metacognitive prompts for performance during practice or after the completion of training. These findings support the idea that improvement in reading comprehension strategies comes from deliberate practice with actionable feedback rather than from explicit metacognitive supports. © 2018, International Artificial Intelligence in Education Society.
Pages: 420-438
Page count: 18