Motion opponency examined throughout visual cortex with multivariate pattern analysis of fMRI data

Cited by: 5
Authors
Silva, Andrew E. [1 ,2 ]
Thompson, Benjamin [1 ]
Liu, Zili [2 ]
Affiliations
[1] Univ Waterloo, Sch Optometry & Vis Sci, 200 Columbia St W, Waterloo, ON N2L 3G1, Canada
[2] Univ Calif Los Angeles, Dept Psychol, Los Angeles, CA USA
Funding
US National Science Foundation; Natural Sciences and Engineering Research Council of Canada
Keywords
hMT+; motion perception; MVPA; noise reduction; V3A; V5; PERCEPTION; AREAS; DISCRIMINATION; RESPONSES; SUBJECT; SIGNALS; DESIGNS; MODEL; FMRI
DOI
10.1002/hbm.25198
Chinese Library Classification (CLC) code
Q189 [Neuroscience]
Subject classification code
071006
Abstract
This study explores how the human brain solves the challenge of flicker noise in motion processing. Although it provides no useful directional information, flicker is common in the visual environment and carries omnidirectional motion energy that is processed by low-level motion detectors. Models of motion processing propose a mechanism called motion opponency that reduces flicker processing: local motion signals are pooled, and motion energy in opposite directions cancels, to yield an overall motion direction. A neural correlate of motion opponency has been observed in human area MT+/V5, whereby stimuli with perfectly balanced motion energy, constructed from dots moving in counter-phase, elicit a weaker response than non-balanced (in-phase) motion stimuli. Building on this previous work, we used multivariate pattern analysis to examine whether the activation patterns elicited by motion-opponent stimuli resemble those elicited by flicker noise across the human visual cortex. Robust multivariate signatures of opponency were observed in V5 and in V3A. Our results support the notion that V5 is centrally involved in motion opponency and in the reduction of flicker. Furthermore, these results demonstrate the utility of multivariate analysis methods in revealing the role of additional visual areas, such as V3A, in opponency and in motion processing more generally.
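The central MVPA question in the abstract, whether voxel patterns evoked by opponent (counter-phase) stimuli resemble those evoked by flicker, can be illustrated with a toy cross-classification sketch. The Python snippet below is an assumption-laden illustration rather than the study's actual pipeline: the condition names, simulated patterns, trial and voxel counts, and the choice of a linear SVM (scikit-learn) are all hypothetical.

import numpy as np
from sklearn.svm import SVC

# Minimal, hypothetical sketch of the cross-classification idea described in
# the abstract; it is NOT the authors' pipeline. Trial counts, voxel counts,
# and the simulated patterns below are assumptions for illustration only.
rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200  # assumed trials per condition, voxels per ROI

# In a real analysis these would be single-trial response patterns (e.g.,
# beta estimates) extracted from a region of interest such as V5 or V3A.
flicker = rng.normal(0.0, 1.0, (n_trials, n_voxels))        # flicker noise
in_phase = rng.normal(0.5, 1.0, (n_trials, n_voxels))       # non-opponent motion
counter_phase = rng.normal(0.1, 1.0, (n_trials, n_voxels))  # opponent stimuli

# Train a linear classifier to separate flicker from in-phase motion patterns.
X = np.vstack([flicker, in_phase])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = flicker, 1 = in-phase
clf = SVC(kernel="linear").fit(X, y)

# If opponent (counter-phase) patterns resemble flicker, the classifier should
# tend to label them as flicker rather than as directional motion.
prop_flicker = np.mean(clf.predict(counter_phase) == 0)
print(f"Proportion of opponent trials labelled as flicker: {prop_flicker:.2f}")

In practice, the simulated arrays would be replaced with single-trial response estimates from each region of interest, and classification would be cross-validated across scanning runs.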
Pages: 5-13
Page count: 9