Exploiting Object-of-Interest Information to Understand Attention in VR Classrooms

Cited by: 33
Authors
Bozkir, Efe [1 ]
Stark, Philipp [2 ]
Gao, Hong [1 ]
Hasenbein, Lisa [3 ]
Hahn, Jens-Uwe [4 ]
Kasneci, Enkelejda [1 ]
Goellner, Richard [3 ]
Affiliations
[1] University of Tübingen, Human-Computer Interaction, Tübingen, Germany
[2] University of Tübingen, Human-Computer Interaction, Hector Research Institute of Education Sciences and Psychology, Tübingen, Germany
[3] University of Tübingen, Hector Research Institute of Education Sciences and Psychology, Tübingen, Germany
[4] Hochschule der Medien Stuttgart, Stuttgart, Germany
Source
2021 IEEE Virtual Reality and 3D User Interfaces (VR) | 2021
Keywords
Human-centered computing; Human computer interaction (HCI); Empirical studies in HCI; Computing methodologies; Computer graphics; Graphics systems and interfaces; Virtual reality; Applied computing; Education; Interactive learning environments; Computer-assisted instruction; Immersive virtual reality; Students; Sciences; Learn
DOI
10.1109/VR50410.2021.00085
Chinese Library Classification (CLC)
TP3 [Computing technology; computer technology]
Subject Classification Code
0812
Abstract
Recent developments in computer graphics and hardware technology enable easy access to virtual reality headsets with integrated eye trackers, leading to mass adoption of such devices. The immersive experience provided by virtual reality and the possibility to control environmental factors in virtual setups may soon help create realistic digital alternatives to conventional classrooms. The importance of such settings became especially evident during the COVID-19 pandemic, which forced many schools and universities to switch to digital teaching. Researchers foresee that this transformation will continue, with virtual worlds becoming an integral part of education. Until now, however, students' behaviors in immersive virtual environments have not been investigated in depth. In this work, we study students' attention by exploiting object-of-interest information obtained via eye tracking under different classroom manipulations. More specifically, we varied the sitting positions of students, the visualization styles of virtual avatars, and the hand-raising percentages of peer-learners. Our empirical evidence shows that such manipulations play an important role in students' attention towards virtual peer-learners, instructors, and the lecture material. This research may contribute to the understanding of how visual attention relates to social dynamics in the virtual classroom and offers important considerations for the design of virtual learning spaces.
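The object-of-interest analysis outlined in the abstract can be illustrated with a minimal sketch (not taken from the paper): assuming each eye-tracking sample has already been mapped, for example via gaze-ray casting in the virtual scene, to the object class it lands on (peer-learner, instructor, lecture material), attention can be summarized as each class's share of total dwell time. All names and values below are hypothetical.

    # Minimal sketch: aggregate labeled gaze samples into per-object-class
    # attention shares. Each sample is (object_label, duration_in_seconds).
    from collections import defaultdict

    def attention_shares(gaze_samples):
        """Return the fraction of total dwell time spent on each object class."""
        dwell = defaultdict(float)
        for label, duration_s in gaze_samples:
            dwell[label] += duration_s
        total = sum(dwell.values())
        return {label: t / total for label, t in dwell.items()} if total else {}

    # Example: samples from one participant in one classroom condition.
    samples = [("lecture_material", 2.4), ("instructor", 0.8),
               ("peer_learner", 0.3), ("lecture_material", 1.1),
               ("instructor", 0.5)]
    print(attention_shares(samples))
    # roughly {'lecture_material': 0.69, 'instructor': 0.25, 'peer_learner': 0.06}

Shares computed this way can then be compared across the manipulated conditions (sitting position, avatar visualization style, peer hand-raising percentage).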
Pages: 597-605
Page count: 9