Synthetic cues (e.g., arrows and boxes) that are predictive of a target's location shorten search times and increase decision accuracy. Similarly, when observers search natural scenes, a highly visible object (e.g., a house) that frequently co-occurs with a sought target (e.g., a chimney) in natural environments influences eye movements and facilitates search when the target appears close to that object. Although single-cell neurophysiology, human electrophysiology, and neuroimaging have led to great advances over the last few decades in our understanding of how attention and synthetic cues affect neural activity, little is known about the neural mechanisms that mediate context effects during visual search in real scenes.

Here, we propose to measure neural activity with functional magnetic resonance imaging (fMRI) while observers search for targets in real scenes, and to use neural decoding methods (multivariate pattern analysis) together with a novel variation of population receptive field methods to: 1) determine the brain areas that represent the spatial location of scene context and thus might mediate the guidance of search in real scenes; and 2) evaluate whether the coding of scene context is automatic or is modulated by top-down visual attention.

The proposed work will improve our understanding of the neural mechanisms of scene context, arguably one of the most important strategies observers use to optimize visual search in natural environments. Our results will also advance our understanding of the function and role of brain regions related to attention, objects/scenes, and contextual associations in visual search. These advances might help identify neural correlates of poor behavioral performance in patients with low vision and attentional deficits in an ecologically important task such as visual search in real scenes.
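To illustrate the decoding approach named above, the sketch below shows a minimal multivariate pattern analysis in Python with scikit-learn. It is not the authors' pipeline: the voxel responses are simulated, and the trial counts, voxel counts, and labels are assumptions chosen only to demonstrate how cross-validated decoding of context location from activity patterns might be set up.

# Illustrative sketch (hypothetical data, not the proposed experiments):
# decode a binary context-location label from simulated voxel patterns
# with a cross-validated linear classifier, as in typical fMRI MVPA.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_voxels = 200, 500          # assumed trial and voxel counts
labels = rng.integers(0, 2, n_trials)  # e.g., context object left (0) or right (1)

# Simulated voxel responses: Gaussian noise plus a weak label-dependent signal
signal = np.outer(labels, rng.normal(0.0, 0.5, n_voxels))
patterns = rng.normal(0.0, 1.0, (n_trials, n_voxels)) + signal

# Above-chance cross-validated accuracy would indicate that the region's
# activity pattern carries information about the spatial location of context.
clf = LinearSVC(max_iter=10000)
scores = cross_val_score(clf, patterns, labels, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f}")

In practice such an analysis would be run on preprocessed voxel responses within anatomically or functionally defined regions of interest, but the cross-validation logic is the same.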

Public Health Relevance

When searching for objects, humans use knowledge of the layout of the scene to guide eye movements and facilitate finding the sought object. We seek to use non-invasive measures of human neural activity to identify the brain mechanisms that mediate this guidance by scene context. The tools and framework developed could potentially lead to the assessment of neural signatures in patients with visual and attentional deficits.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Exploratory/Developmental Grants (R21)
Project #
1R21EY023097-01
Application #
8436142
Study Section
Cognition and Perception Study Section (CP)
Program Officer
Steinmetz, Michael A
Project Start
2013-03-01
Project End
2015-02-28
Budget Start
2013-03-01
Budget End
2014-02-28
Support Year
1
Fiscal Year
2013
Total Cost
$229,500
Indirect Cost
$79,500
Name
University of California Santa Barbara
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
094878394
City
Santa Barbara
State
CA
Country
United States
Zip Code
93106
Eckstein, Miguel P; Schoonveld, Wade; Zhang, Sheng et al. (2015) Optimal and human eye movements to clustered low value cues to increase decision rewards during search. Vision Res 113:137-54
Koehler, Kathryn; Guo, Fei; Zhang, Sheng et al. (2014) What do saliency models predict? J Vis 14:14
Preston, Tim J; Guo, Fei; Das, Koel et al. (2013) Neural representations of contextual guidance in visual search of real-world scenes. J Neurosci 33:7846-55