Modeling the Effects of Perceptual Load: Saliency, Competitive Interactions, and Top-Down Biases

Cited by: 10
Authors
Neokleous, Kleanthis [1 ,2 ]
Shimi, Andria [3 ]
Avraamides, Marios N. [1 ,4 ]
Affiliations
[1] Univ Cyprus, Dept Psychol, CY-1678 Nicosia, Cyprus
[2] Univ Cyprus, Dept Comp Sci, CY-1678 Nicosia, Cyprus
[3] Univ Oxford, Dept Expt Psychol, S Parks Rd, Oxford OX1 3UD, England
[4] Univ Cyprus, Ctr Appl Neurosci, CY-1678 Nicosia, Cyprus
Source
FRONTIERS IN PSYCHOLOGY, 2016, Vol. 7
Keywords
perceptual load; selective attention; distractor interference; dilution; SELECTIVE VISUAL-ATTENTION; SPATIAL ATTENTION; WORKING-MEMORY; NEURONAL SYNCHRONIZATION; NEURAL MECHANISMS; AREA V4; CORTEX; SEARCH; CAPTURE; BRAIN;
DOI
10.3389/fpsyg.2016.00001
CLC Number
B84 [Psychology]
Subject Classification
04; 0402
Abstract
A computational model of visual selective attention has been implemented to account for experimental findings on the Perceptual Load Theory (PLT) of attention. The model was designed on the basis of existing neurophysiological findings on attentional processes, with the objective of offering an explicit and biologically plausible formulation of PLT. Simulation results verified that the proposed model captures the basic pattern of results that supports PLT, as well as findings that are considered contradictory to the theory. Importantly, the model reproduces the behavioral results from a dilution experiment, thus providing a way to reconcile PLT with the competing Dilution account. Overall, the model presents a novel account of PLT effects based on the low-level competitive interactions among neurons that represent visual input and the top-down signals that modulate neural activity. The implications of the model for the debate on the locus of selective attention, as well as for the origins of distractor interference in visual displays of varying load, are discussed.
Pages: 15
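
The abstract attributes load effects to competitive interactions among the neurons representing the visual input, biased by top-down signals. As a rough, illustrative sketch only (not the authors' implementation), the Python snippet below expresses that idea with a divisive-normalization form of biased competition: every item's drive is scaled by a top-down gain and divided by the pooled drive of the whole display, so adding task-relevant items (higher perceptual load) enlarges the suppressive pool and weakens the response to an unchanged distractor. The function name, gain values, and the sigma constant are all hypothetical.

```python
import numpy as np

def normalized_response(drive, attn_gain, sigma=0.5):
    """Divisive-normalization sketch of biased competition: each unit's
    stimulus drive is scaled by a top-down attentional gain and divided
    by the pooled drive of every item in the display."""
    excitation = drive * attn_gain
    return excitation / (sigma + excitation.sum())

# Index 0 = target, last index = distractor, middle entries = task-relevant
# non-target items (the "load"). The top-down bias boosts only the target.
low_load = normalized_response(drive=np.array([1.0, 0.9]),
                               attn_gain=np.array([1.5, 1.0]))
high_load = normalized_response(drive=np.array([1.0, 0.8, 0.8, 0.8, 0.8, 0.9]),
                                attn_gain=np.array([1.5, 1.0, 1.0, 1.0, 1.0, 1.0]))

print("distractor response, low load :", round(low_load[-1], 3))   # ~0.31
print("distractor response, high load:", round(high_load[-1], 3))  # ~0.15
```

Running the sketch prints a distractor response of roughly 0.31 under low load versus roughly 0.15 under high load, mirroring the reduced distractor interference that PLT predicts for high-load displays.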