Immediate use of prosody and context in predicting a syntactic structure

Cited by: 31
Authors
Nakamura, Chie [1 ,2 ,5 ]
Arai, Manabu [1 ,3 ,5 ]
Mazuka, Reiko [1 ,4 ]
Affiliations
[1] RIKEN, Brain Sci Inst, Lab Language Dev, Wako, Saitama 3510918, Japan
[2] Keio Univ, Grad Sch Sci & Technol, Tokyo 108, Japan
[3] Univ Tokyo, Dept Language & Informat Sci, Tokyo 1138654, Japan
[4] Duke Univ, Durham, NC 27706 USA
[5] Japan Soc Promot Sci, Tokyo, Japan
Keywords
Prosody; Contrastive intonation; Context; Prediction; Anticipatory eye-movements; Structural ambiguity; SPOKEN LANGUAGE; EYE-MOVEMENTS; RESOLUTION; TIME; COMPREHENSION; INTONATION; AMBIGUITY; JAPANESE;
DOI
10.1016/j.cognition.2012.07.016
Chinese Library Classification (CLC)
B84 [Psychology]
Subject Classification Code
04; 0402
Abstract
Numerous studies have reported effects of prosodic information on parsing, but whether prosody can influence even the initial parsing decision remains unclear. In a visual-world eye-tracking experiment, we investigated the influence of contrastive intonation and visual context on the processing of temporarily ambiguous relative-clause sentences in Japanese. Our results showed that listeners used the prosodic cue to make a structural prediction before hearing disambiguating information. Importantly, the effect was limited to cases where the visual scene provided an appropriate context for the prosodic cue, ruling out the explanation that listeners had simply associated marked prosodic information with a less frequent structure. Furthermore, the influence of the prosodic information was also evident following the disambiguating information, in a way that reflected the initial analysis. The current study demonstrates that prosody, when provided with an appropriate context, influences both the initial syntactic analysis and the subsequent processing cost at the disambiguating information. The results also provide the first evidence for pre-head structural prediction driven by prosodic and contextual information in a head-final construction. (c) 2012 Elsevier B.V. All rights reserved.
Pages: 317-323
Page count: 7