Segmented Analysis of Eye Gaze Behaviors of Fluent and Stuttered Speech

Cited by: 0
Authors
Hudock, Daniel [1 ]
Stuart, Andrew [2 ]
Saltuklaroglu, Tim [3 ]
Zhang, Jianliang [4 ]
Murray, Nicholas [5 ]
Kalinowski, Joseph [2 ]
Altieri, Nicholas [1 ]
Affiliations
[1] Idaho State Univ, Dept Commun Sci & Disorders, Div Hlth Sci, Pocatello, ID 83209 USA
[2] E Carolina Univ, Coll Allied Hlth Sci, Dept Commun Sci & Disorders, Greenville, NC USA
[3] Univ Tennessee, Dept Audiol & Speech Pathol, Knoxville, TN USA
[4] N Carolina Cent Univ, Dept Allied Profess, Sch Educ, Durham, NC USA
[5] E Carolina Univ, Dept Kinesiol, Coll Hlth & Human Performance, Greenville, NC USA
Source
CANADIAN JOURNAL OF SPEECH-LANGUAGE PATHOLOGY AND AUDIOLOGY | 2015, Vol. 39, No. 2
Keywords
STUTTER; STAMMER; FLUENCY; FLUENCY DISORDERS; COMMUNICATION; PERCEPTION; EYE GAZE; EYE TRACKING;
DOI
Not available
Chinese Library Classification (CLC)
R49 [Rehabilitation Medicine]
Discipline Classification Code
100215
Abstract
Purpose: To measure the effect of stuttering on eye gaze in fluent speakers viewing video presentations of typical speakers and of people who stutter (PWS), given that eye-gaze behaviors provide indicators of emotion and communicative integrity.

Method: Sixteen fluent college-age adults, naive to stuttering, observed six 30-second audiovisual speech samples: three of PWS and three of age- and gender-matched controls who do not stutter (PWNS). A desk-mounted eye tracker recorded the amount of time participants spent watching four regions of interest (ROIs) in the stimulus videos: eyes, nose, mouth, and "outside" (i.e., any gaze point not falling within the eye, nose, or mouth area). The proportion of gaze time in each ROI served as the dependent variable. Comparisons were made between the proportions of time spent in each ROI for the PWS and PWNS speaker groups, and between fluent and disfluent speech segments produced by the PWS.

Results: Participants spent significantly more time watching the eyes (i.e., maintaining eye contact) when viewing PWNS than PWS, and significantly more time observing the mouth regions of PWS. When watching the videos of PWS, participants spent significantly more time observing the nose and mouth regions when the speech was stuttered (PWS-S) than when it was fluent (PWS-F).

Conclusions: Overall, the difference in eye-gaze patterns across speaker groups is interpreted as indicating negative emotional responses to stuttering. The current findings align with previous research showing that stuttered speech elicits negative reactions from listeners. Specifically, stuttering behaviors avert listeners' gaze from the speaker's eyes. Gaze aversion is a clear sign of disrupted communication that is visible to PWS and may contribute to their negative reactions to their own stuttering.
Pages: 134-145
Number of pages: 12
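
The Method section reduces eye-tracking output to the proportion of viewing time spent in each of four ROIs (eyes, nose, mouth, outside) per viewing condition. The sketch below is a minimal illustration of that data-reduction step only, assuming a hypothetical exported fixation table; the column names, the rectangular ROI boundaries, the condition labels, and the classify_roi helper are assumptions for illustration and do not reflect the authors' actual analysis pipeline.

```python
# Illustrative sketch (hypothetical data layout): compute the proportion of
# gaze time falling in each region of interest (ROI) per viewing condition,
# mirroring the dependent variable described in the abstract.
import pandas as pd

# Hypothetical rectangular ROIs in screen pixels: (x_min, y_min, x_max, y_max).
ROIS = {
    "eyes":  (300, 150, 500, 220),
    "nose":  (350, 220, 450, 290),
    "mouth": (330, 290, 470, 360),
}

def classify_roi(x: float, y: float) -> str:
    """Assign a fixation point to eyes/nose/mouth, or 'outside' otherwise."""
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "outside"

# Hypothetical fixation export: one row per fixation with its duration (ms),
# screen coordinates, and the condition of the video segment being viewed
# (PWNS, PWS-F = fluent PWS speech, PWS-S = stuttered PWS speech).
fixations = pd.DataFrame({
    "condition": ["PWNS", "PWNS", "PWS-F", "PWS-S", "PWS-S"],
    "x": [410, 360, 420, 400, 250],
    "y": [180, 320, 200, 310, 400],
    "duration_ms": [240, 180, 300, 220, 260],
})

fixations["roi"] = [classify_roi(x, y) for x, y in zip(fixations["x"], fixations["y"])]

# Total dwell time per ROI within each condition, then normalize so the four
# ROI proportions sum to 1 within each condition.
dwell = fixations.groupby(["condition", "roi"])["duration_ms"].sum()
proportions = dwell / dwell.groupby(level="condition").transform("sum")
print(proportions.round(3))
```

In the study itself, such ROI proportions were the quantities compared between the PWS and PWNS speaker groups and between fluent and stuttered segments; the sketch covers only how proportions of gaze time could be derived from raw fixations, not the statistical comparisons.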