The project consists of applied and basic research, with a decided focus on the latter. On the applied side, the team will continue refining the test-bed navigation system for the blind developed during the last four years. The system guides a blind person through an outdoor environment and provides information about prominent landmarks and environmental features. A differentially corrected GPS receiver worn by the traveler determines the person's longitude and latitude, which are communicated to a computer holding a spatial database of environmental landmarks. A virtual acoustic display indicates the positions of environmental features and landmarks by having their labels, spoken by a speech synthesizer, appear as sounds at the appropriate locations within the traveler's auditory space. Experimental research includes an experiment comparing spatialized sound with non-spatialized synthesized speech in fairly realistic settings.

The basic research is relevant to the long-term development of an effective navigation system but focuses on underlying non-visual spatial processes. There are four basic research topics: auditory space perception, path integration, the learning of spatial layout, and the learning of route configurations by "preview". In connection with auditory space perception, they will conduct a systematic study of the factors influencing the extracranial localization of earphone sound and another study to determine whether the perceived locations of auditory targets fully determine the perceived interval between them. In connection with path integration (a form of navigation in which self-motion is integrated to determine current position and orientation), they will address the effects on path integration of homing to spatialized sound vs. passive guidance (by way of the sighted-guide technique) and of the scale of the path.
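To make the display logic concrete: placing a landmark's spoken label in the traveler's auditory space amounts to computing the landmark's azimuth relative to the traveler's current heading from the GPS fix. The following is a minimal sketch, not the project's actual implementation; the function name and the flat-earth approximation (reasonable over walking distances) are assumptions for illustration.

```python
import math

def bearing_to_landmark(lat, lon, heading_deg, lm_lat, lm_lon):
    """Hypothetical helper: azimuth of a landmark (degrees, -180..180)
    relative to the traveler's heading, using a local flat-earth
    approximation of the latitude/longitude differences."""
    # Convert degree offsets to local east/north distances in meters.
    m_per_deg_lat = 111_320.0  # approx. meters per degree of latitude
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat))
    east = (lm_lon - lon) * m_per_deg_lon
    north = (lm_lat - lat) * m_per_deg_lat
    # Absolute bearing clockwise from true north, then heading-relative.
    abs_bearing = math.degrees(math.atan2(east, north))
    return (abs_bearing - heading_deg + 180.0) % 360.0 - 180.0
```

A spatial-audio renderer would then pan the synthesized label to this azimuth, so a landmark to the traveler's right is heard from the right.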
In connection with the learning of spatial layout, they will conduct experiments with repeated traversal of a path. The studies gradually increase the complexity of the subject's task, starting with perceiving and remembering the location of a single landmark while traversing a straight path and ending with learning the spatial layout of several off-route landmarks while repeatedly traversing a square path. In these tasks they will compare the relative effectiveness of spatialized sound and non-spatialized speech for conveying the locations of the landmarks (relative to the subject's current location). They will also investigate whether, if a path is repeatedly explored in the same direction, the learned representation is orientation-specific. The experiments on spatial learning by preview compare learning a route by walking with learning it through auditory or haptic exposure.
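The path-integration tasks above rest on a simple computational idea: current position and orientation can be recovered by accumulating self-motion. As a sketch only (the turn-then-translate step model and coordinate convention are assumptions, not the project's method), dead reckoning over a sequence of movements looks like this:

```python
import math

def integrate_path(steps):
    """Dead-reckon position and heading from (turn_deg, distance_m)
    movements: each turn updates heading, each distance translates
    along the current heading.  Heading is measured in degrees
    clockwise from north; position starts at the origin."""
    x = y = 0.0
    heading = 0.0
    for turn_deg, distance in steps:
        heading = (heading + turn_deg) % 360.0
        x += distance * math.sin(math.radians(heading))
        y += distance * math.cos(math.radians(heading))
    return x, y, heading
```

For example, four 10 m legs with 90-degree right turns trace a square and return the traveler to the origin, which is the kind of closed path used in the layout-learning experiments.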

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY009740-07
Application #
6384350
Study Section
Special Emphasis Panel (ZRG1-VISB (05))
Program Officer
Oberdorfer, Michael
Project Start
1997-12-01
Project End
2003-05-31
Budget Start
2001-06-01
Budget End
2002-05-31
Support Year
7
Fiscal Year
2001
Total Cost
$145,262
Indirect Cost
Name
University of California Santa Barbara
Department
Type
Organized Research Units
DUNS #
City
Santa Barbara
State
CA
Country
United States
Zip Code
93106
Avraamides, Marios N; Loomis, Jack M; Klatzky, Roberta L et al. (2004) Functional equivalence of spatial representations derived from vision and language: evidence from allocentric judgments. J Exp Psychol Learn Mem Cogn 30:804-14
Klatzky, Roberta L; Lippa, Yvonne; Loomis, Jack M et al. (2003) Encoding, learning, and spatial updating of multiple object locations specified by 3-D sound, spatial language, and vision. Exp Brain Res 149:48-61
Klatzky, Roberta L; Lederman, Susan J (2003) Representing spatial location and layout from sparse kinesthetic contacts. J Exp Psychol Hum Percept Perform 29:310-25
Loomis, Jack M; Lippa, Yvonne; Golledge, Reginald G et al. (2002) Spatial updating of locations specified by 3-d sound and spatial language. J Exp Psychol Learn Mem Cogn 28:335-45
Klatzky, Roberta L; Lippa, Yvonne; Loomis, Jack M et al. (2002) Learning directions of objects specified by vision, spatial audition, or auditory spatial language. Learn Mem 9:364-7