Visual search is a basic behavior in animals that must forage for sustenance or reproduction. Under ordinary circumstances humans and non-human primates explore their visual world using eye movements, and the efficiency of visual search in humans is related to the number and accuracy of those eye movements. The processes underlying visual search involve pure visual processing, the generation of visual attention, and, ordinarily, the generation of eye movements. A brain network of visual, attentional, and oculomotor areas is involved in visual search, but little is known about the neural processes by which the primate brain accomplishes this critical function. The goal of this project is to understand the brain mechanisms underlying visual search by studying the activity of 3 nodes in the network: the lateral intraparietal area (LIP), the frontal eye field (FEF), and prestriate visual area V4. We have recently discovered that neurons in LIP exhibit 3 different signals during visual search, discernible in the activity of almost every neuron we have studied: 1) a bottom-up visual signal describing the abrupt appearance of an object in the neuron's response field but bearing no information about the direction of the impending saccade or the nature of the stimulus in the receptive field (e.g. whether it is the search target or a distractor); 2) a saccadic signal which predicts the goal and reaction time of an impending eye movement; and 3) a cognitive signal which describes the nature of the object in the neuron's response field regardless of the direction of the impending eye movement. These signals sum in a predictable way to create a salience map used by the brain in visual search. LIP is connected both to oculomotor areas, with which it can exchange information useful for the generation of eye movements, and to visual areas, in which the same information can be used to specify the locus of visual attention.
V4 and the FEF project to LIP, and presumably contribute differently to its salience map. In these experiments we will record the activity of all 3 areas during visual search and determine what types of information each area contributes to the other two. We will train rhesus monkeys on two visual search tasks: one in which they are free to move their eyes, and one in which they must hold their eyes still, fixating a central point. Neural activity will be recorded while the monkeys perform the tasks.
The first aim of the project is to study the activity of neurons in LIP, V4, and FEF, comparing their activity in the free-viewing and fixation tasks as well as in a simple visually guided delayed-saccade task. The second is to study the effects of electrical stimulation in each area on the monkeys' performance of the search tasks. The third is to study the effects of transient inactivation of each area on the performance of the tasks and on the activity of neurons in the remaining areas. These experiments will provide insight into how the brain decides where to look during visual search, and how it transforms visual information into movement.

Project Narrative
Visual search is impaired in human patients with parietal and frontal lesions. Answering the questions posed in these specific aims will lead to a greater understanding of how the cerebral cortex orders the processes underlying visual search. This in turn will lead to insight into the visual and oculomotor deficits that are so devastating in humans with lesions of frontal and parietal cortex, and will aid the design of diagnostic, prognostic, and rehabilitative strategies.