Selective visual attention during public speaking in an immersive context

Cited: 0
Authors
Mikael Rubin
Sihang Guo
Karl Muller
Ruohan Zhang
Michael J. Telch
Mary M. Hayhoe
Affiliations
[1] The University of Texas at Austin, Department of Psychology and Institute for Mental Health Research
[2] The University of Texas at Austin, Center for Perceptual Systems
Source
Attention, Perception, & Psychophysics | 2022, Vol. 84
Keywords
Virtual reality; Social cues; Eye tracking; 360° video; Public speaking
DOI
Not available
Abstract
It has recently become feasible to study selective visual attention to social cues in increasingly ecologically valid ways. In this secondary analysis, we examined gaze behavior in response to the actions of others in a social context. Participants (N = 84) were asked to give a 5-minute speech to a five-member audience that had been filmed in 360° video, displayed in a virtual reality headset containing a built-in eye tracker. Audience members were coached to make movements that would indicate interest or lack of interest (e.g., nodding vs. looking away). The goal of this paper was to analyze whether these actions influenced the speaker’s gaze. We found that participants showed reliable evidence of gaze towards audience member actions in general, and towards audience member actions involving their phone specifically (compared with other actions like looking away or leaning back). However, there were no differences in gaze towards actions reflecting interest (like nodding) compared with actions reflecting lack of interest (like looking away). Participants were more likely to look away from audience member actions as well, but there were no specific actions that elicited looking away more or less. Taken together, these findings suggest that the actions of audience members are broadly influential in motivating gaze behaviors in a realistic, contextually embedded (public speaking) setting. Further research is needed to examine the ways in which these findings can be elucidated in more controlled laboratory environments as well as in the real world.
Pages: 396-407
Number of pages: 11
Related papers
50 in total
  • [1] Selective visual attention during public speaking in an immersive context
    Rubin, Mikael
    Guo, Sihang
    Muller, Karl
    Zhang, Ruohan
    Telch, Michael J.
    Hayhoe, Mary M.
    ATTENTION PERCEPTION & PSYCHOPHYSICS, 2022, 84 (02) : 396 - 407
  • [2] Comparing Eye Tracking and Head Tracking During a Visual Attention Task in Immersive Virtual Reality
    Llanes-Jurado, Jose
    Marin-Morales, Javier
    Moghaddasi, Masoud
    Khatri, Jaikishan
    Guixeres, Jaime
    Alcaniz, Mariano
    HUMAN-COMPUTER INTERACTION: INTERACTION TECHNIQUES AND NOVEL APPLICATIONS, HCII 2021, PT II, 2021, 12763 : 32 - 43
  • [3] Infants' Selective Visual Attention Is Dependent on Maternal Affect and Emotional Context
    Juvrud, Joshua
    Haas, Sara A.
    Fox, Nathan A.
    Gredeback, Gustaf
    FRONTIERS IN PSYCHOLOGY, 2021, 12
  • [4] A Markovian Algorithm for Creating Immersive Public-Speaking Audiences
    Thomas, Nicklaus
    Evans, David
    Russ, Samuel
    2013 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE), 2013, : 382 - 383
  • [5] Hyperscanning and avoidance in social anxiety disorder: The visual scanpath during public speaking
    Chen, Nigel Teik Ming
    Thomas, Laurenn Maree
    Clarke, Patrick Joseph Fraser
    Hickie, Ian Bernard
    Guastella, Adam John
    PSYCHIATRY RESEARCH, 2015, 225 (03) : 667 - 672
  • [6] Selective Visual Attention during Mirror Exposure in Anorexia and Bulimia Nervosa
    Tuschen-Caffier, Brunna
    Bender, Caroline
    Caffier, Detlef
    Klenner, Katharina
    Braks, Karsten
    Svaldi, Jennifer
    PLOS ONE, 2015, 10 (12):
  • [7] Technology acceptance model: investigating students' intentions toward adoption of immersive 360° videos for public speaking rehearsals
    Vallade, Jessalyn I.
    Kaufmann, Renee
    Frisby, Brandi N.
    Martin, Joe C.
    COMMUNICATION EDUCATION, 2021, 70 (02) : 127 - 145
  • [8] Effect of a Full Immersive Virtual Reality Intervention on Selective Attention in Children
    Amprasi, Evaggelia
    Vernadakis, Nikolaos
    Zetou, Eleni
    Antoniou, Panagiotis
    INTERNATIONAL JOURNAL OF INSTRUCTION, 2022, 15 (01) : 565 - 582
  • [9] A selective attention model for predicting visual attractors
    Dinet, Eric
    Kubicki, Emmanuel
    2008 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, VOLS 1-12, 2008, : 697 - 700
  • [10] Virtual Reality Sickness Reduces Attention During Immersive Experiences
    Mimnaugh, Katherine J.
    Center, Evan G.
    Suomalainen, Markku
    Becerra, Israel
    Lozano, Eliezer
    Murrieta-Cid, Rafael
    Ojala, Timo
    Lavalle, Steven M.
    Federmeier, Kara D.
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2023, 29 (11) : 4394 - 4404