People face complex mobility challenges in natural settings every day, whether walking down a busy sidewalk, through a crowded train station, or in a shopping mall. To guide locomotion, the visual system detects information about self-motion through an evolving layout of objects and other pedestrians, and generates a safe and efficient path of travel. Individuals with low vision report mobility as one of the most difficult activities of daily living, particularly walking in crowds or using public transportation, with increased risks of collision, injury, and reduced independence. Yet researchers do not understand how vision is used to control locomotor behavior in such complex, everyday settings. The long-term objective of the proposed project is to develop the first vision-based model of pedestrian behavior in dynamic, crowded environments, and to use the results to design more effective assistive technology. Most models of locomotor control (from robotics, computer animation, and biology) take the 3D positions and velocities of environmental objects as input and plan a collision-free path according to objective criteria. A vision-based model would instead take the optical information available to a pedestrian as input and generate human-like paths of locomotion, grounded in experimental data.

The first specific aim is thus to determine the effective visual information that guides walking with a crowd. Specifically, we will test the hypotheses that (a) optic flow, (b) segmented 2D motion, or (c) perceived 3D motion is used to follow multiple neighbors, and determine how this information is spatially and temporally integrated. The second specific aim is to determine the visual control laws that regulate walking speed and direction in a crowd. Specifically, we will test competing models of collision avoidance, following, and overtaking, and formalize a vision-based pedestrian model. Based on these results, the third specific aim is to evaluate alternative approaches to sensory substitution for locomotor guidance. Specifically, we will compare coding schemes for a vibrotactile belt: recoding the effective optical variables as tactile patterns, or using the vision-based model to steer the user with directional cuing.

Behavioral experiments will test the optical variables and control laws that govern locomotion in crowds by manipulating visual displays during walking in an immersive virtual environment (12 m × 14 m). Agent-based simulations will compare competing models against the experimental data and previously collected crowd data. This methodology will enable us to test alternative hypotheses about visual information and visual control laws, and to create an experimentally grounded, vision-based pedestrian model. Sensory substitution experiments will test normally sighted participants in matched visual and tactile virtual environments; if the results are promising, tests with low-vision and blind participants will be pursued in subsequent applications. The research will contribute basic knowledge about visually guided locomotion in complex, dynamic environments, and apply it to the design of an assistive mobility device.
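To make the agent-based modeling approach concrete, the Python sketch below shows the general form of control law the project would compare against data: a pedestrian accelerates toward a distance-weighted average of neighbor speeds and turns toward a distance-weighted average of neighbor headings. This is a minimal, hypothetical example in the spirit of published behavioral-dynamics models of crowd following; the function names, the exponential distance weighting, and the parameter values (gains c and k, decay constant d0) are illustrative assumptions, not the project's actual model.

import numpy as np

def speed_control(v_self, v_neighbors, distances, c=1.0, d0=4.0):
    """Speed matching (hypothetical form): accelerate toward the
    distance-weighted mean speed of visible neighbors, with nearer
    neighbors weighted more via an exponential falloff (d0, meters).
    Returns a scalar acceleration (m/s^2)."""
    w = np.exp(-np.asarray(distances) / d0)
    v_target = np.sum(w * v_neighbors) / np.sum(w)
    return c * (v_target - v_self)

def heading_control(phi_self, phi_neighbors, distances, k=1.0, d0=4.0):
    """Heading alignment (hypothetical form): turn toward the
    distance-weighted circular mean of neighbor headings (radians).
    Returns a turning rate (rad/s)."""
    w = np.exp(-np.asarray(distances) / d0)
    sin_m = np.sum(w * np.sin(phi_neighbors)) / np.sum(w)
    cos_m = np.sum(w * np.cos(phi_neighbors)) / np.sum(w)
    phi_target = np.arctan2(sin_m, cos_m)
    # signed angular error, wrapped to [-pi, pi)
    err = (phi_target - phi_self + np.pi) % (2 * np.pi) - np.pi
    return k * err

# Usage: one Euler integration step for a pedestrian with two neighbors.
dt = 0.05                          # integration step (s)
v, phi = 1.2, 0.0                  # current speed (m/s) and heading (rad)
v_nb = np.array([1.4, 1.3])        # neighbor speeds (m/s)
phi_nb = np.array([0.2, -0.1])     # neighbor headings (rad)
d_nb = np.array([2.0, 5.0])        # neighbor distances (m)

v += speed_control(v, v_nb, d_nb) * dt
phi += heading_control(phi, phi_nb, d_nb) * dt

In a simulation, many such agents would be stepped forward in parallel and the resulting trajectories compared with human crowd data. Note that this sketch takes 3D positions and speeds as input, like the conventional models the abstract critiques; the experiments in Aims 1 and 2 are what would determine whether optical variables (e.g., optic flow or segmented 2D motion) should replace those inputs in a truly vision-based model.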

Public Health Relevance

This project will study how vision guides walking in complex everyday situations like a crowded shopping mall or a busy train station, which are particularly challenging for people with low vision. In the short run, the research will contribute to basic scientific knowledge about visually guided behavior in dynamic, crowded environments. In the longer term, it will lead to better assistive technology, such as tactile mobility aids and self-guiding wheelchairs, aimed at improving the quality of life and independence of people who are partially sighted or blind.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY029745-02
Application #: 9873968
Study Section: Cognition and Perception Study Section (CP)
Program Officer: Wiggs, Cheri
Project Start: 2019-03-01
Project End: 2024-02-29
Budget Start: 2020-03-01
Budget End: 2021-02-28
Support Year: 2
Fiscal Year: 2020
Name: Brown University
Department: Social Sciences
Type: Schools of Arts and Sciences
DUNS #: 001785542
City: Providence
State: RI
Country: United States
Zip Code: 02912