See what I hear? Beat perception in auditory and visual rhythms

Cited: 0
Author
Jessica A. Grahn
Affiliation
[1] Brain and Mind Institute, Department of Psychology, University of Western Ontario
Source
Experimental Brain Research | 2012, Vol. 220
Keywords
Music; Rhythm; Timing; Auditory perception; Visual perception; Cross-modal comparisons
DOI
Not available
Abstract
Our perception of time is affected by the modality in which it is conveyed. Moreover, certain temporal phenomena appear to exist in only one modality. The perception of temporal regularity or structure (e.g., the ‘beat’) in rhythmic patterns is one such phenomenon: visual beat perception is rare. The modality-specificity for beat perception is puzzling, as the durations that comprise rhythmic patterns are much longer than the limits of visual temporal resolution. Moreover, the optimization that beat perception provides for memory of auditory sequences should be equally relevant to visual sequences. Why does beat perception appear to be modality specific? One possibility is that the nature of the visual stimulus plays a role. Previous studies have usually used brief stimuli (e.g., light flashes) to present visual rhythms. In the current study, a rotating line that appeared sequentially in different spatial orientations was used to present a visual rhythm. Discrimination accuracy for visual rhythms and auditory rhythms was compared for different types of rhythms. The rhythms either had a regular temporal structure that previously has been shown to induce beat perception in the auditory modality, or they had an irregular temporal structure without beat-inducing qualities. Overall, the visual rhythms were discriminated more poorly than the auditory rhythms. The beat-based structure, however, increased accuracy for visual as well as auditory rhythms. These results indicate that beat perception can occur in the visual modality and improve performance on a temporal discrimination task, when certain types of stimuli are used.
Pages: 51-61
Page count: 10