Visual search is a basic behavior in animals that must forage for sustenance or reproduction. Under ordinary circumstances humans and non-human primates explore their visual world using eye movements, and the efficiency of visual search in humans is related to the number and accuracy of those eye movements. The processes underlying visual search involve pure visual processing, the generation of visual attention, and ordinarily, the generation of eye movements. A brain network of visual, attentional, and oculomotor areas is involved in visual search, but little is known about the neural processes by which the primate brain accomplishes this critical process. The goal of this project is to understand the brain mechanisms underlying visual search by studying the activity of 3 nodes in the network: the lateral intraparietal area (LIP), the frontal eye field (FEF), and prestriate visual area V4. We have recently discovered that neurons in LIP exhibit 3 different signals during visual search which are discernible in the activity of almost every neuron we have studied: 1) a bottom-up visual signal describing the abrupt appearance of an object in the response field but bearing no information about the direction of the impending saccade or the nature of the stimulus in the receptive field (e.g. whether it is the search target or a distractor); 2) a saccadic signal which predicts the goal and reaction time of an impending eye movement; and 3) a cognitive signal which describes the nature of the object in the response field regardless of the direction of the impending eye movement. These signals sum in a predictable way to create a salience map used by the brain in visual search. LIP is connected both to oculomotor areas, with which it can exchange information useful for the generation of eye movements, and to visual areas, in which the same information can be used to specify the locus of visual attention.
V4 and the FEF project to LIP, and presumably contribute differently to its activity. In these experiments we will study the activity of all 3 areas during visual search, and determine what types of information each area contributes to the other two. We will train rhesus monkeys on two visual search tasks, one in which they are free to move their eyes, and one in which they must hold their eyes still, fixating a central point. Neural activity will be recorded while the monkeys perform the tasks.
The aims of the project are first to study the activity of neurons in LIP, V4, and FEF, comparing their activity in the free-viewing and fixation tasks as well as in a simple visually guided delayed-saccade task. The second is to study the effects of electrical stimulation in each area on the performance of the monkeys in the search tasks. The third is to study the effects of transient inactivation of each area on the performance of the tasks and on the activity of neurons in the remaining areas. These experiments will provide insight into how the brain makes decisions about where to look during visual search, and how it transforms visual information into movement.
Project Narrative
Visual search is impaired in human patients with parietal and frontal lesions. Answering the questions posed in these specific aims will lead to a greater understanding of how the cerebral cortex orders the processes underlying visual search. This in turn will lead to insight into the visual and oculomotor deficits that are so devastating in humans with lesions of frontal and parietal cortex, and will aid the design of diagnostic, prognostic, and rehabilitative strategies.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY017039-03
Application #
7804503
Study Section
Central Visual Processing Study Section (CVP)
Program Officer
Steinmetz, Michael A
Project Start
2008-05-01
Project End
2013-04-30
Budget Start
2010-05-01
Budget End
2011-04-30
Support Year
3
Fiscal Year
2010
Total Cost
$369,185
Indirect Cost
Name
Columbia University (N.Y.)
Department
Neurosciences
Type
Schools of Medicine
DUNS #
621889815
City
New York
State
NY
Country
United States
Zip Code
10032
Semework, Mulugeta; Steenrod, Sara C; Goldberg, Michael E (2018) A spatial memory signal shows that the parietal cortex has access to a craniotopic representation of space. Elife 7:
Sendhilnathan, Naveen; Basu, Debaleena; Murthy, Aditya (2017) Simultaneous analysis of the LFP and spiking activity reveals essential components of a visuomotor transformation in the frontal eye field. Proc Natl Acad Sci U S A 114:6370-6375
Wang, Xiaodong; Guo, Xiaotao; Chen, Lin et al. (2017) Auditory to Visual Cross-Modal Adaptation for Emotion: Psychophysical and Neural Correlates. Cereb Cortex 27:1337-1346
Zhang, Wujie; Falkner, Annegret L; Krishna, B Suresh et al. (2017) Coupling between One-Dimensional Networks Reconciles Conflicting Dynamics in LIP and Reveals Its Recurrent Circuitry. Neuron 93:221-234
Sun, Linus D; Goldberg, Michael E (2016) Corollary Discharge and Oculomotor Proprioception: Cortical Mechanisms for Spatially Accurate Vision. Annu Rev Vis Sci 2:61-84
Wang, Xiaolan; Fung, C C Alan; Guan, Shaobo et al. (2016) Perisaccadic Receptive Field Expansion in the Lateral Intraparietal Area. Neuron 90:400-9
Krishna, B Suresh; Ipata, Anna E; Bisley, James W et al. (2014) Extrafoveal preview benefit during free-viewing visual search in the monkey. J Vis 14:
Zhang, Mingsha; Wang, Xiaolan; Goldberg, Michael E (2014) A spatially nonselective baseline signal in parietal cortex reflects the probability of a monkey's success on the current trial. Proc Natl Acad Sci U S A 111:8967-72
Falkner, Annegret L; Goldberg, Michael E; Krishna, B Suresh (2013) Spatial representation and cognitive modulation of response variability in the lateral intraparietal area priority map. J Neurosci 33:16117-30
Steenrod, Sara C; Phillips, Matthew H; Goldberg, Michael E (2013) The lateral intraparietal area codes the location of saccade targets and not the dimension of the saccades that will be made to acquire them. J Neurophysiol 109:2596-605

Showing the most recent 10 out of 18 publications