Basic and applied research is described, the goal of which is to develop a microcomputer-based navigation aid for the visually impaired. The device will (a) inform a visually impaired traveler of his or her current location and orientation with respect to the environment being navigated, (b) provide information about the proximal surroundings, and (c) assist in route planning. Portable GPS (satellite-based Global Positioning System) receivers will be used to determine the traveler's longitude and latitude, and a computerized database will contain information about environmental features at and around that location. A primary consideration in designing the system will be to promote independence on the part of the visually impaired traveler. Experiments and questionnaires will be used to evaluate alternatives in the design of the system. Among the design alternatives to be considered is the type of display. Experiments will compare navigation performance and user evaluations obtained with a virtual auditory display to those obtained with a display that provides verbal instructions. The virtual display will indicate the positions of landmarks by having their labels, spoken by a speech synthesizer, appear as sounds at the correct locations within the auditory space of the traveler. Basic experimental research will pursue two general issues, one concerning inherent limitations on navigation without sight and one investigating the implications, positive or negative, of device-aided navigation for learning about complex environments.
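
To make the described architecture concrete, the sketch below (Python; not part of the proposal, with all function names, variable names, and coordinates being illustrative, and with the assumption that the traveler's heading is available from an on-body compass or similar sensor) shows how a GPS fix and one landmark record from an environmental database could be converted into the distance and head-relative azimuth that a verbal display would speak and that a virtual auditory display would hand to a spatial-audio renderer.

import math

# Hypothetical sketch: given a GPS fix (latitude/longitude in degrees) and a
# landmark record from the environmental database, compute the distance and
# the head-relative bearing a virtual auditory display would need in order to
# place the landmark's spoken label at the correct location in the traveler's
# auditory space. All names here are illustrative, not from the proposal.

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (degrees clockwise
    from true north) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)

    # Haversine formula for the distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    # Initial bearing from point 1 to point 2
    y = math.sin(dlam) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

def head_relative_azimuth(bearing_deg, heading_deg):
    """Bearing of the landmark relative to the traveler's facing direction,
    in the range [-180, 180): positive = to the right, negative = to the left."""
    return (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0

# Example: a traveler facing due east (heading 90 degrees) and one landmark
# record; the coordinates are hypothetical points near the UCSB campus.
traveler = {"lat": 34.4140, "lon": -119.8489, "heading": 90.0}
landmark = {"label": "library entrance", "lat": 34.4135, "lon": -119.8455}

dist_m, bearing = distance_and_bearing(traveler["lat"], traveler["lon"],
                                        landmark["lat"], landmark["lon"])
azimuth = head_relative_azimuth(bearing, traveler["heading"])

# A verbal display might speak this result; a virtual auditory display would
# instead pass (azimuth, dist_m) to a spatial-audio renderer so the synthesized
# label appears to come from the landmark's direction.
print(f"{landmark['label']}: {dist_m:.0f} m, {azimuth:+.0f} deg relative to heading")
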

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY009740-02
Application #: 3267105
Study Section: Special Emphasis Panel (SSS (B7))
Project Start: 1992-08-01
Project End: 1996-07-31
Budget Start: 1993-08-01
Budget End: 1994-07-31
Support Year: 2
Fiscal Year: 1993
Total Cost:
Indirect Cost:
Name: University of California Santa Barbara
Department:
Type: Organized Research Units
DUNS #:
City: Santa Barbara
State: CA
Country: United States
Zip Code: 93106
Avraamides, Marios N; Loomis, Jack M; Klatzky, Roberta L et al. (2004) Functional equivalence of spatial representations derived from vision and language: evidence from allocentric judgments. J Exp Psychol Learn Mem Cogn 30:804-14
Klatzky, Roberta L; Lippa, Yvonne; Loomis, Jack M et al. (2003) Encoding, learning, and spatial updating of multiple object locations specified by 3-D sound, spatial language, and vision. Exp Brain Res 149:48-61
Klatzky, Roberta L; Lederman, Susan J (2003) Representing spatial location and layout from sparse kinesthetic contacts. J Exp Psychol Hum Percept Perform 29:310-25
Loomis, Jack M; Lippa, Yvonne; Golledge, Reginald G et al. (2002) Spatial updating of locations specified by 3-d sound and spatial language. J Exp Psychol Learn Mem Cogn 28:335-45
Klatzky, Roberta L; Lippa, Yvonne; Loomis, Jack M et al. (2002) Learning directions of objects specified by vision, spatial audition, or auditory spatial language. Learn Mem 9:364-7