The goal of this proposal is to understand human vision in the context of natural behavior. This is of fundamental importance because relatively little is known about how vision functions in the natural world, and many important issues arise in this setting that are absent, or difficult to address, in standard paradigms. It is our contention that understanding many aspects of vision, such as selective attention and the control of gaze, will be impossible without investigation of vision in its natural context. Previous attempts to explain gaze patterns have almost exclusively concerned static, restricted stimulus conditions and have focused on the properties of the stimulus. Such models cannot extend to natural behavior, where the visual input is dynamic and the observer's behavioral goals play a dominant role. The pervasive effect of reward in the neural circuitry underlying saccadic eye movements, the development of the mathematics of Reinforcement Learning, and the application of statistical decision theory to sensory-motor behavior together allow a novel framework for understanding the sequential acquisition of visual information in the context of normal behavior. We explore evidence for this framework in the proposal. Specifically, we examine the role of reward, uncertainty, and prior knowledge in the control of gaze in the natural world, with the goal of providing a formal structure for understanding complex behavioral sequences. The research focuses on dynamic environments, where a central open question is how the visual system balances the need to attend to existing goals against the need to maintain sensitivity to new information that may present opportunities or threats. We investigate the role of reward, uncertainty, and prior knowledge in the control of gaze in experiments in an immersive virtual walking environment. We then test the predictions of a model based on reinforcement learning in a simple divided-attention task.
We then use recently developed methods of Inverse Reinforcement Learning to estimate the intrinsic rewards of different behavioral goals in the walking environment. This will allow us, for the first time, to infer intrinsic human rewards directly, and to predict gaze sequences in novel situations. We also explore the nature and complexity of prior knowledge and the role of prediction in intercepting moving objects in a virtual environment. Although investigation of natural behavior is challenging, the potential benefit is that it provides the empirical basis and theoretical tools for understanding how complex behavioral sequences are generated. The experiments will lay the groundwork for investigation of clinical populations in contexts such as walking, and will also have direct relevance to safety issues in driving and any situation involving multi-tasking.
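The reinforcement-learning framing described above can be illustrated with a toy model, in the spirit of task-scheduling accounts of gaze. This is not the proposal's actual model: all quantities below (the task priorities, the uncertainty growth rate, the state discretization) are invented for demonstration. A tabular Q-learner chooses, each time step, which of two concurrent tasks to fixate; unattended tasks accumulate uncertainty, and each step incurs a cost equal to the priority-weighted sum of the uncertainties, so the learner is pushed to fixate the higher-priority task more often.

```python
import random

N_TASKS = 2
PRIORITY = [1.0, 0.25]   # hypothetical subjective importance of each task
GROWTH = 0.2             # uncertainty added per step to each unattended task

def bucket(unc):
    """Discretize continuous uncertainties into a small state tuple."""
    return tuple(min(3, int(u / 0.5)) for u in unc)

def step(unc, action):
    """Uncertainty grows everywhere; fixating a task resets it to zero.
    The per-step reward is the negative priority-weighted uncertainty."""
    unc = [u + GROWTH for u in unc]
    unc[action] = 0.0
    cost = sum(p * u for p, u in zip(PRIORITY, unc))
    return unc, -cost

def train(episodes=3000, horizon=30, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Standard epsilon-greedy tabular Q-learning over the bucketed states."""
    rng = random.Random(seed)
    q = {}
    for _ in range(episodes):
        unc = [0.0] * N_TASKS
        for _ in range(horizon):
            s = bucket(unc)
            if rng.random() < eps:
                a = rng.randrange(N_TASKS)
            else:
                a = max(range(N_TASKS), key=lambda x: q.get((s, x), 0.0))
            unc, r = step(unc, a)
            best = max(q.get((bucket(unc), x), 0.0) for x in range(N_TASKS))
            q[(s, a)] = q.get((s, a), 0.0) + alpha * (
                r + gamma * best - q.get((s, a), 0.0))
    return q

def gaze_shares(q, steps=300):
    """Fraction of fixations the greedy learned policy gives each task."""
    counts = [0] * N_TASKS
    unc = [0.0] * N_TASKS
    for _ in range(steps):
        s = bucket(unc)
        a = max(range(N_TASKS), key=lambda x: q.get((s, x), 0.0))
        counts[a] += 1
        unc, _ = step(unc, a)
    return [c / steps for c in counts]

if __name__ == "__main__":
    # The high-priority task should attract the larger share of fixations.
    print(gaze_shares(train()))
```

Inverting this forward model — searching for the priority weights whose optimal gaze allocation best reproduces observed fixation shares — is, in miniature, the spirit of the Inverse Reinforcement Learning step described above.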
This grant explores the use of vision and the control of eye movements in the context of natural, visually guided behavior. The present experiments help define what tasks subjects need to perform when walking, and what visual information those tasks require. This is a necessary first step that will lay the groundwork for investigation of clinical populations, since patient data needs to be interpreted in the context of normal performance. The experiments also have direct relevance to safety issues in driving and any situation involving multi-tasking. The development of virtual environments for natural tasks is important because it allows us to safely investigate situations that would be dangerous to test in the real world.
Li, Chia-Ling; Aivar, M Pilar; Kit, Dmitry M et al. (2016) Memory and visual search in naturalistic 2D and 3D environments. J Vis 16:9
Boucart, Muriel; Delerue, Celine; Thibaut, Miguel et al. (2015) Impact of Wet Macular Degeneration on the Execution of Natural Actions. Invest Ophthalmol Vis Sci 56:6832-8
Gottlieb, Jacqueline; Hayhoe, Mary; Hikosaka, Okihide et al. (2014) Attention, reward, and information seeking. J Neurosci 34:15497-504
Hayhoe, Mary; Ballard, Dana (2014) Modeling task control of eye movements. Curr Biol 24:R622-8
Johnson, Leif; Sullivan, Brian; Hayhoe, Mary et al. (2014) Predicting human visuomotor behaviour in a driving task. Philos Trans R Soc Lond B Biol Sci 369:20130044
Kit, Dmitry; Katz, Leor; Sullivan, Brian et al. (2014) Eye movements, visual search and scene memory, in an immersive virtual environment. PLoS One 9:e94362
Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry et al. (2013) Real-time recording and classification of eye movements in an immersive virtual environment. J Vis 13:
Delerue, Celine; Hayhoe, Mary; Boucart, Muriel (2013) Eye movements during natural actions in patients with schizophrenia. J Psychiatry Neurosci 38:317-24
Iorizzo, Dana B; Riley, Meghan E; Hayhoe, Mary et al. (2011) Differential impact of partial cortical blindness on gaze strategies when sitting and walking - an immersive virtual reality study. Vision Res 51:1173-84
Tatler, Benjamin W; Hayhoe, Mary M; Land, Michael F et al. (2011) Eye guidance in natural vision: reinterpreting salience. J Vis 11:5
Showing the most recent 10 out of 16 publications