Visual search has typically been studied using button-press dependent measures and fairly simple stimuli, methodological choices that have rendered current theories of search poorly equipped to predict the locations and durations of individual search movements to featurally complex real-world objects. In response to these limitations, the goals of the proposed project are to: (1) describe real-world visual search in terms of directly observable and spatio-temporally exact eye movement behavior, and (2) introduce a computational model capable of accommodating real-world oculomotor search. Work on this interdisciplinary project will be accomplished in three phases. Phase 1 will implement the computational model. Filter-based image processing techniques will be used to represent the real-world search stimuli, and visual routines acting on these representations will endow the model with simulated oculomotor behavior. The visual information available to each of these eye movements will be constrained by a simulated fovea that moves over the scene as the model's "eye" gradually converges on the search target. Phase 2 will apply this behavioral and computational approach to address basic questions regarding real-world oculomotor search (set size effects, target presence/absence, etc.). Behavioral studies will determine how people direct their gaze as they search for simple and real-world targets. Computational studies will then input to the model the same search scenes viewed by the human observers and compare the simulated eye movement behavior to the sequence of saccades and fixations obtained from the behavioral studies. Phase 3 will build on the results of Phase 2 by using the gaze patterns predicted by the model to test several novel questions regarding search processes and representations (e.g., What is the role of complex backgrounds in a search task? Should search items be treated as objects or as spatially extensive image patches? What are the relationships between visual search and memory?). Finding spatio-temporal agreement between human and simulated gaze patterns in these studies will not only provide the literature with a validated computational model of oculomotor search, but also open to researchers a (real) world of stimuli to challenge our understanding of visual search behavior.
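The Phase 1 ideas (a simulated fovea constraining available visual information, and gaze gradually converging on the target) can be illustrated with a minimal toy sketch. This is not the proposal's actual model: here a Gaussian acuity map stands in for the simulated fovea, and simple template correlation stands in for the filter-based representations; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def foveate(scene, fix, sigma=8.0):
    """Weight scene values by a Gaussian acuity map centered on the
    current fixation, crudely simulating loss of peripheral resolution
    (a stand-in for the proposal's simulated fovea)."""
    h, w = scene.shape
    ys, xs = np.mgrid[0:h, 0:w]
    acuity = np.exp(-((ys - fix[0]) ** 2 + (xs - fix[1]) ** 2) / (2 * sigma ** 2))
    return scene * acuity

def next_fixation(scene, target, fix, sigma=8.0):
    """Correlate the target template with the foveated scene and
    return the best-matching location as the next saccade target."""
    fov = foveate(scene, fix, sigma)
    th, tw = target.shape
    best, best_loc = -np.inf, fix
    for y in range(scene.shape[0] - th + 1):
        for x in range(scene.shape[1] - tw + 1):
            score = np.sum(fov[y:y + th, x:x + tw] * target)
            if score > best:
                best, best_loc = score, (y + th // 2, x + tw // 2)
    return best_loc

def search(scene, target, start, max_saccades=10):
    """Iterate fixations until gaze stabilizes, producing a toy scanpath
    (a list of fixation coordinates) for the searched scene."""
    fix, path = start, [start]
    for _ in range(max_saccades):
        nxt = next_fixation(scene, target, fix)
        if nxt == fix:
            break
        fix = nxt
        path.append(fix)
    return path
```

On a blank scene containing a single bright patch, this loop produces a scanpath that starts at the initial fixation and ends on the patch, the same kind of fixation sequence the proposal compares against human eye movement data.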

Agency
National Institutes of Health (NIH)
Institute
National Institute of Mental Health (NIMH)
Type
Research Project (R01)
Project #
5R01MH063748-04
Application #
6931237
Study Section
Biobehavioral and Behavioral Processes 3 (BBBP)
Program Officer
Kurtzman, Howard S
Project Start
2002-09-05
Project End
2007-07-31
Budget Start
2005-08-01
Budget End
2006-07-31
Support Year
4
Fiscal Year
2005
Total Cost
$112,875
Indirect Cost
Name
State University of New York at Stony Brook
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
804878247
City
Stony Brook
State
NY
Country
United States
Zip Code
11794
Alexander, Robert G; Zelinsky, Gregory J (2018) Occluded information is restored at preview but not during visual search. J Vis 18:4
Schmidt, Joseph; Zelinsky, Gregory J (2017) Adding details to the attentional template offsets search difficulty: Evidence from contralateral delay activity. J Exp Psychol Hum Percept Perform 43:429-437
Yu, Chen-Ping; Maxfield, Justin T; Zelinsky, Gregory J (2016) Searching for Category-Consistent Features: A Computational Approach to Understanding Visual Category Representation. Psychol Sci 27:870-84
Zelinsky, Gregory J; Yu, Chen-Ping (2015) Clutter perception is invariant to image size. Vision Res 116:142-51
Zelinsky, Gregory J; Bisley, James W (2015) The what, where, and why of priority maps and their interactions with visual working memory. Ann N Y Acad Sci 1339:154-64
Schmidt, Joseph; MacNamara, Annmarie; Proudfit, Greg Hajcak et al. (2014) More target features in visual working memory leads to poorer search guidance: evidence from contralateral delay activity. J Vis 14:8
Alexander, Robert G; Schmidt, Joseph; Zelinsky, Gregory J (2014) Are summary statistics enough? Evidence for the importance of shape in guiding visual search. Vis cogn 22:595-609
Maxfield, Justin T; Stalder, Westri D; Zelinsky, Gregory J (2014) Effects of target typicality on categorical search. J Vis 14:
Yu, Chen-Ping; Samaras, Dimitris; Zelinsky, Gregory J (2014) Modeling visual clutter perception using proto-object segmentation. J Vis 14:
Zelinsky, Gregory J; Adeli, Hossein; Peng, Yifan et al. (2013) Modelling eye movements in a categorical search task. Philos Trans R Soc Lond B Biol Sci 368:20130058

Showing the most recent 10 out of 31 publications