Our normal everyday perception of 3-dimensional space, and our ability to interact and navigate within that space, requires the integration of information from multiple sensory modalities. However, the integration of distance/depth information in the intermediate range (2-20 m), where the visual and auditory modalities provide the primary inputs, is not well understood. The long-term goal of this project is a complete understanding of how auditory and visual information is integrated to form distance percepts that can support accurate orientation and navigation in both normal and sensory-impaired populations. The objective of this application is to test and refine an innovative conceptual framework that represents these integration processes in a normal-hearing, normal-vision population. The central hypothesis guiding this framework is that distance perception, unlike the perception of direction, requires additional contextual or background information about the environment beyond that provided by the object itself. This background representation can act as a frame of reference for coding distance. For multisensory distance input, object, contextual, and background information must be integrated across modalities. Because not all of this information is necessarily available at the same time, memory must be involved in the integration process. The rationale underlying the proposed research is that once a conceptual framework for auditory/visual distance integration has been specified and validated for normal populations, new and innovative approaches can be applied to understanding and minimizing the impact of sensory impairments on spatial perception and navigation. This hypothesis will be tested by pursuing two specific aims: 1) reveal an integrated auditory and visual reference frame for distance perception based on the environmental background; 2) determine the role of working memory in auditory/visual distance perception.
These aims will be addressed by testing human distance-judgment and navigation performance under conditions in which the contributions of contextual information, background information, or working memory are manipulated. Virtual and real stimulus-manipulation techniques will allow novel pairings of auditory and visual information that will be used to evaluate and refine the proposed framework. Development and validation of this framework will be a significant contribution because it will provide a better understanding of how humans successfully integrate auditory and visual information to perform spatial tasks in the environment. Moreover, it will provide a vehicle for future studies to advance the field of multisensory space perception. The proposed research is relevant to public health because it will lead to a better understanding of how auditory or visual impairment affects multisensory space perception. Ultimately, this knowledge may inform the development of new strategies for assisting or enhancing degraded spatial information to improve orientation and navigation abilities in visually- and/or hearing-impaired populations.
Space perception in the intermediate distance range (2-20 m) depends on both the auditory and visual systems and plays a crucial role in everyday activities, including navigation. The proposed psychophysical research investigates how auditory and visual spatial information is integrated for distance perception. The research outcomes will advance our scientific understanding of how humans perceive and interact with the world, and will also improve non-invasive methods for assessing malfunctions of space perception in hearing- and/or visually-impaired populations.