The long-term objective of this project is to develop a dynamical model of visually controlled locomotor behavior, which can be used to simulate visual mobility problems and predict paths of locomotion in everyday environments. The research pursues two closely related problems: first, the behavioral dynamics of locomotion, or how safe paths of locomotion are generated through a complex, changing environment; second, the control laws for locomotion, or how this behavior is guided by perceptual control strategies based on multisensory information. In previous research, we developed a dynamical model of four elementary locomotor behaviors: (a) steering to a stationary goal, (b) avoiding a stationary obstacle, (c) intercepting a moving target, and (d) avoiding a moving obstacle. Using virtual reality techniques with freely walking participants, we empirically tested each component and then studied how the elementary behaviors are integrated in more complex situations. We found that linear combinations of the components successfully predict human paths with various pairs of goals and obstacles, such as a moving target with a stationary obstacle, and even with pairs of participants in a pursuit/evasion task; however, attentional factors appear to play a role with multiple obstacles. In a second set of studies, we tested the contributions of optic flow, egocentric direction, and gaze/head/body alignment to online steering control and visual-locomotor adaptation.
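The linear combination of elementary behaviors described above can be illustrated with the steering dynamics of Fajen and Warren (2003), in which the angular acceleration of the walker's heading sums a goal-attraction term and one repulsion term per obstacle. The following is a minimal sketch, not the project's implementation: the functional form follows the published model, but the parameter values, function names, and the fixed goal/obstacle geometry are illustrative assumptions.

```python
import math

def steering_dynamics(phi, dphi, goal, obstacles,
                      b=3.25, kg=7.5, c1=0.40, c2=0.40,
                      ko=198.0, c3=6.5, c4=0.8):
    """Angular acceleration of heading phi (rad), in the spirit of
    Fajen & Warren (2003); parameter values are illustrative.
    goal = (psi_g, d_g): goal direction (rad) and distance (m).
    obstacles: list of (psi_o, d_o) pairs."""
    psi_g, d_g = goal
    # Damping plus goal attraction; attraction stiffens as the goal nears
    acc = -b * dphi - kg * (phi - psi_g) * (math.exp(-c1 * d_g) + c2)
    # Obstacle repulsion turns the heading away from each obstacle,
    # decaying with angular and radial distance; components sum linearly
    for psi_o, d_o in obstacles:
        acc += (ko * (phi - psi_o)
                * math.exp(-c3 * abs(phi - psi_o))
                * math.exp(-c4 * d_o))
    return acc

def simulate(phi0, goal, obstacles, dt=0.01, steps=1000):
    """Euler-integrate the heading dynamics; returns final heading (rad)."""
    phi, dphi = phi0, 0.0
    for _ in range(steps):
        acc = steering_dynamics(phi, dphi, goal, obstacles)
        dphi += acc * dt
        phi += dphi * dt
    return phi
```

In the full model, goal and obstacle angles and distances are recomputed each step from the agent's changing position; here they are held fixed to isolate the heading dynamics. With no obstacles the heading settles on the goal direction; adding an obstacle near the goal shifts the equilibrium heading away from it, which is the route-selection behavior the abstract refers to.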
The specific aims of the proposed research are to pursue locomotor interactions between pedestrians, map out multisensory contributions to perceptual control strategies, and apply the model to simulate visual impairments and mobility problems. We will manipulate visual information during walking using virtual environments presented in a head-mounted display, podokinetic information by locomotor adaptation, vestibular information by galvanic stimulation, and neck proprioception by muscle vibration. Four studies are proposed: (1) Pedestrian interactions will investigate locomotor interactions between pedestrians, including a pair of participants, a participant interacting with a simulated virtual agent, a participant interacting with a crowd of virtual agents, and naturalistic studies of pedestrian traffic flow. (2) Multisensory control strategies will test the contributions of visual, podokinetic, vestibular, and proprioceptive information to steering control. (3) Complex behaviors will extend the model to incorporate attentional factors and speed control, and address the question of "online" control versus advance path planning. (4) Modeling peripheral field loss will empirically test and seek to model the effects of tunnel vision and homonymous hemianopsia on mobility.

Public Health Relevance

The results will contribute to basic knowledge about the visual control of locomotion and apply it to clinical understanding of visual mobility problems in disease and aging. A working model of locomotor behavior would allow us to predict individual paths of locomotion or pedestrian traffic in everyday situations, simulate specific visual impairments and mobility problems, inform architectural planning to facilitate mobility, and evaluate potential remediations.

National Institutes of Health (NIH)
National Eye Institute (NEI)
Research Project (R01)
Study Section
Cognition and Perception Study Section (CP)
Program Officer
Wiggs, Cheri
Brown University
Schools of Arts and Sciences
United States
Zhao, Huaiyong; Warren, William H (2015) On-line and model-based approaches to the visual control of action. Vision Res 110:190-202
Rio, Kevin W; Rhea, Christopher K; Warren, William H (2014) Follow the leader: visual control of speed in pedestrian following. J Vis 14:
Gerin-Lajoie, Martin; Ciombor, Deborah McK; Warren, William H et al. (2010) Using ambulatory virtual environments for the assessment of functional gait impairment: a proof-of-concept study. Gait Posture 31:533-6
Bruggeman, Hugo; Warren, William H (2010) The direction of walking--but not throwing or kicking--is adapted by optic flow. Psychol Sci 21:1006-13
Fink, Philip W; Foo, Patrick S; Warren, William H (2009) Catching fly balls in virtual reality: a critical test of the outfielder problem. J Vis 9:14.1-8
Bosworth, Rain G; Dobkins, Karen R (2009) Chromatic and luminance contrast sensitivity in fullterm and preterm infants. J Vis 9:15.1-16
Warren, William H (2009) How do animals get about by vision? Visually controlled locomotion and orientation after 50 years. Br J Psychol 100:277-81
Bruggeman, Hugo; Zosh, Wendy; Warren, William H (2007) Optic flow drives human visuo-locomotor adaptation. Curr Biol 17:2035-40
Fajen, Brett R; Warren, William H (2007) Behavioral dynamics of intercepting a moving target. Exp Brain Res 180:303-19
Warren, William H (2006) The dynamics of perception and action. Psychol Rev 113:358-89

Showing the most recent 10 out of 20 publications