An ultimate goal of vision science is to understand and predict performance under natural conditions, and there is arguably no task more ubiquitous and fundamental than visually searching the environment for objects of interest. However, little is known about search in natural scenes.
Our aim is to measure, characterize, and model search performance in natural scenes in both humans and non-human primates. We will study both covert search, where the search scene is presented for a limited time and the eyes remain on a fixation target, and overt search, which involves eye movements. Our approach combines mathematical/computational analysis, psychophysical and eye-movement measurements in humans and monkeys, and voltage-sensitive dye imaging (VSDI) in V1 of behaving monkeys. We argue that a rigorous understanding of visual search must begin with a characterization of the sensory factors that control the detectability of the target across the visual field in the absence of any uncertainty about the target's location. In preliminary psychophysical, neurophysiological, and computational studies we developed a model of target detectability that predicts simple detection performance in natural images. In further computational studies (based on the detection results), we discovered a biologically plausible computation that would allow the brain to perform optimal overt search in natural scenes. Specifically, we prove that during visual search the optimal next fixation location is obtained from the current belief map (posterior probability map) of where the target might be located by (i) dividing (normalizing) the map by the local image contrast, (ii) blurring this contrast-normalized map appropriately, and (iii) selecting the peak of the blurred map as the next fixation location. The proposed studies will measure human and monkey detection and visual search performance in natural images, and will measure VSDI responses in monkeys during fixation and detection tasks. These studies will let us test and further develop our models of simple detection and covert search in natural images, and determine which components of the optimal fixation-selection strategy humans use when searching natural images. They may also lead to methods for training humans to use more optimal search strategies.
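
For illustration, the three-step fixation-selection rule described above can be sketched in code. This is a minimal sketch under assumed details: the function name next_fixation, the Gaussian form of the blur, and the parameter blur_sigma are illustrative choices, not the specific computation developed in the proposal.

    # Minimal sketch (assumed names and libraries) of the three-step rule:
    # (i) contrast-normalize the posterior map, (ii) blur it, (iii) take the peak.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def next_fixation(posterior_map, local_contrast, blur_sigma=2.0, eps=1e-8):
        """Return (row, col) of the next fixation from a 2-D belief map.

        posterior_map  -- posterior probability of the target at each location
        local_contrast -- local image contrast at each location
        blur_sigma     -- placeholder for the 'appropriate' blur, which the
                          abstract leaves unspecified (assumed Gaussian here)
        """
        normalized = posterior_map / (local_contrast + eps)         # step (i)
        blurred = gaussian_filter(normalized, sigma=blur_sigma)     # step (ii)
        return np.unravel_index(np.argmax(blurred), blurred.shape)  # step (iii)

    # Hypothetical usage on a 128 x 128 belief map
    posterior = np.random.rand(128, 128)
    posterior /= posterior.sum()
    contrast = 0.2 + np.random.rand(128, 128)
    row, col = next_fixation(posterior, contrast)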

Public Health Relevance

Ultimate goals of vision science are to understand vision in the real world and to mitigate the effects of visual dysfunction on real-world performance. The proposed studies to develop a model of overt and covert visual search performance in natural scenes will provide rigorous steps toward those ultimate goals and may lead to improved design of low-vision displays, environments, and training procedures.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY024662-03
Application #: 9331655
Study Section: Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section (SPC)
Program Officer: Wiggs, Cheri
Project Start: 2015-09-01
Project End: 2019-08-31
Budget Start: 2017-09-01
Budget End: 2018-08-31
Support Year: 3
Fiscal Year: 2017
Total Cost:
Indirect Cost:
Name: University of Texas at Austin
Department: Psychology
Type: Schools of Arts and Sciences
DUNS #: 170230239
City: Austin
State: TX
Country: United States
Zip Code: 78759
Seidemann, Eyal; Geisler, Wilson S (2018) Linking V1 Activity to Behavior. Annu Rev Vis Sci 4:287-310
Geisler, Wilson S (2018) Psychometric functions of uncertain template matching observers. J Vis 18:1
Sebastian, Stephen; Geisler, Wilson S (2018) Decision-variable correlation. J Vis 18:3
Michel, Melchi M; Chen, Yuzhi; Seidemann, Eyal et al. (2018) Nonlinear Lateral Interactions in V1 Population Responses Explained by a Contrast Gain Control Model. J Neurosci 38:10069-10079
Sebastian, Stephen; Abrams, Jared; Geisler, Wilson S (2017) Constrained sampling experiments reveal principles of detection in natural scenes. Proc Natl Acad Sci U S A 114:E5731-E5740
Seidemann, Eyal; Chen, Yuzhi; Bai, Yoon et al. (2016) Calcium imaging with genetically encoded indicators in behaving primates. eLife 5: