Segmented Analysis of Eye Gaze Behaviors of Fluent and Stuttered Speech

Cited: 0
Authors
Hudock, Daniel [1 ]
Stuart, Andrew [2 ]
Saltuklaroglu, Tim [3 ]
Zhang, Jianliang [4 ]
Murray, Nicholas [5 ]
Kalinowski, Joseph [2 ]
Altieri, Nicholas [1 ]
Affiliations
[1] Idaho State Univ, Dept Commun Sci & Disorders, Div Hlth Sci, Pocatello, ID 83209 USA
[2] E Carolina Univ, Coll Allied Hlth Sci, Dept Commun Sci & Disorders, Greenville, NC USA
[3] Univ Tennessee, Dept Audiol & Speech Pathol, Knoxville, TN USA
[4] N Carolina Cent Univ, Dept Allied Profess, Sch Educ, Durham, NC USA
[5] E Carolina Univ, Dept Kinesiol, Coll Hlth & Human Performance, Greenville, NC USA
Source
CANADIAN JOURNAL OF SPEECH-LANGUAGE PATHOLOGY AND AUDIOLOGY | 2015, Vol. 39, No. 2
Keywords
STUTTER; STAMMER; FLUENCY; FLUENCY DISORDERS; COMMUNICATION; PERCEPTION; EYE GAZE; EYE TRACKING;
DOI
Not available
Chinese Library Classification (CLC)
R49 [Rehabilitation Medicine];
Subject Classification Code
100215;
Abstract
Purpose: To measure the effect of stuttering on the eye gaze of fluent speakers while they viewed video presentations of typical speakers and of people who stutter (PWS), because eye-gaze behaviors provide indicators of emotion and communicative integrity.
Method: Sixteen fluent college-age adults, naive to stuttering, observed six 30-second audiovisual speech samples of three PWS and three age- and gender-matched controls who do not stutter (PWNS). A desk-mounted eye-tracker recorded the amount of time participants spent watching four regions of interest (ROIs) in the stimulus videos of PWS and PWNS: eyes, nose, mouth, and "outside" (i.e., any gaze point not falling within the eyes, nose, or mouth area). The proportions of gaze time in each ROI were the dependent variables of interest. Comparisons were made between the proportions of time spent in each ROI for the PWS and PWNS speaker groups, and also between fluent and disfluent speech segments produced by the PWS.
Results: Participants spent significantly more time watching the eyes (i.e., maintaining eye contact) when viewing PWNS than PWS. They also spent significantly more time observing the mouth regions of PWS. When watching the videos of PWS, participants spent significantly more time observing the nose and mouth regions when the speech was stuttered (PWS-S) than when it was fluent (PWS-F).
Conclusions: Overall, the difference in eye-gaze patterns across speaker groups is interpreted as indicating negative emotional responses to stuttering. The current findings align with previous research showing that stuttered speech elicits negative reactions from listeners. Specifically, stuttering behaviors avert listeners' gaze from the speaker's eyes. Gaze aversion is a clear sign of disrupted communication that is visible to PWS and may contribute to their negative reactions to their own stuttering.
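The dependent variable described in the Method is the proportion of gaze time falling in each ROI (eyes, nose, mouth, outside). The sketch below illustrates that computation only in general terms; it is not the authors' analysis code, and the data format, ROI labels, and function name are assumptions for illustration, presuming gaze samples recorded at a fixed rate have already been assigned to ROI labels.

```python
# Illustrative sketch (not the authors' analysis code): compute per-ROI
# gaze-time proportions from eye-tracker samples recorded at a fixed rate.
# Assumes each gaze sample has already been mapped to one ROI label.
from collections import Counter

ROIS = ("eyes", "nose", "mouth", "outside")  # ROIs named in the abstract

def roi_gaze_proportions(roi_labels):
    """roi_labels: sequence of ROI strings, one per gaze sample.
    Returns a dict mapping each ROI to its proportion of total gaze time."""
    counts = Counter(roi_labels)
    total = sum(counts[r] for r in ROIS)
    if total == 0:
        return {r: 0.0 for r in ROIS}
    return {r: counts[r] / total for r in ROIS}

# Toy example: a 30-second sample at 60 Hz would yield 1800 labels.
example = ["eyes"] * 900 + ["mouth"] * 600 + ["nose"] * 150 + ["outside"] * 150
print(roi_gaze_proportions(example))
# {'eyes': 0.5, 'nose': 0.0833..., 'mouth': 0.3333..., 'outside': 0.0833...}
```

Per the abstract, such proportions would then be compared across speaker groups (PWS vs. PWNS) and across fluent versus stuttered segments within the PWS videos.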
Pages: 134-145
Page count: 12