People make tens of thousands of saccadic eye movements each day, and a substantial proportion of these fail to land on the intended object. To be efficient, the visual system must rapidly detect these errors and execute a corrective saccade to the intended object. Previous research has shown that gaze correction is fast and automatic under simple laboratory conditions in which only a single object is present. In the natural world, however, there are many possible gaze targets, and determining which object was the original saccade target will often require a memory representation of that object. However, no previous studies have attempted to demonstrate that memory is used to correct saccade errors. In the proposed research, we plan to show that memory for the saccade target is indeed used to correct saccade errors, and that these corrections are fast, accurate, automatic, and perhaps even unconscious. Our central hypothesis is that visual short-term memory (VSTM) stores target information across the saccade so that the target object can be discriminated from other visible objects and gaze can be efficiently corrected. To test this idea, we developed a new paradigm to simulate saccade errors, in which a stimulus array is shifted slightly during a saccade so that the eyes land between the target object and a distractor object. Our preliminary results suggest that memory-based gaze correction under these conditions is accurate, fast, and automatic. The proposed work will explore the nature of the memory representations used for these gaze corrections, including (a) testing whether the memory representations are stored in VSTM, (b) examining what information is stored in these representations, (c) exploring how these representations are combined with bottom-up sensory information, and (d) determining whether the representations include information about other nearby objects.
These studies will advance our understanding of basic mechanisms of gaze correction that support efficient interaction with objects and agents in the world. Most complex visual tasks (preparing food, driving) depend on fixating a series of objects, and gaze correction plays an important role in ensuring that the eyes are directed efficiently to goal-relevant objects. Thus, the proposed research will provide essential information for understanding conditions that involve deficits in the control of gaze.

National Institutes of Health (NIH)
National Eye Institute (NEI)
Research Project (R01)
Study Section
Cognition and Perception Study Section (CP)
Program Officer
Hunter, Chyren
University of Iowa
Schools of Arts and Sciences
Iowa City
United States
Beck, Valerie M; Luck, Steven J; Hollingworth, Andrew (2018) Whatever you do, don't look at the...: Evaluating guidance by an exclusionary attentional template. J Exp Psychol Hum Percept Perform 44:645-662
Bahle, Brett; Matsukura, Michi; Hollingworth, Andrew (2018) Contrasting gist-based and template-based guidance during real-world visual search. J Exp Psychol Hum Percept Perform 44:367-386
Van der Stigchel, Stefan; Hollingworth, Andrew (2018) Visuospatial Working Memory as a Fundamental Component of the Eye Movement System. Curr Dir Psychol Sci 27:136-143
Bahle, Brett; Beck, Valerie M; Hollingworth, Andrew (2018) The architecture of interaction between visual working memory and visual attention. J Exp Psychol Hum Percept Perform 44:992-1011
Beck, Valerie M; Hollingworth, Andrew (2017) Competition in saccade target selection reveals attentional guidance by simultaneously active working memory representations. J Exp Psychol Hum Percept Perform 43:225-230
Tas, A Caglar; Luck, Steven J; Hollingworth, Andrew (2016) The relationship between visual attention and visual working memory encoding: A dissociation between covert and overt orienting. J Exp Psychol Hum Percept Perform 42:1121-1138
Hollingworth, Andrew; Beck, Valerie M (2016) Memory-based attention capture when multiple items are maintained in visual working memory. J Exp Psychol Hum Percept Perform 42:911-7
Hollingworth, Andrew (2015) Visual working memory modulates within-object metrics of saccade landing position. Ann N Y Acad Sci 1339:11-9
Beck, Valerie M; Hollingworth, Andrew (2015) Evidence for negative feature guidance in visual search is explained by spatial recoding. J Exp Psychol Hum Percept Perform 41:1190-6
Tas, A Caglar; Moore, Cathleen M; Hollingworth, Andrew (2014) The representation of the saccade target object depends on visual stability. Vis cogn 22:1042-1046

Showing the most recent 10 out of 33 publications