Our normal, everyday perception of three-dimensional space and our ability to interact and navigate within that space require the integration of information from multiple sensory modalities. However, the integration of distance/depth information in the intermediate range (2-20 m), where the visual and auditory modalities provide the primary inputs, is not well understood. The long-term goal of this project is a complete understanding of how auditory and visual information is integrated to form distance percepts that can support accurate orientation and navigation in both normal and sensory-impaired populations. The objective of this application is to test and refine an innovative conceptual framework that represents the integration processes in a normal-hearing, normal-vision population. The central hypothesis guiding this framework is that distance perception, unlike the perception of direction, requires additional contextual or background information about the environment beyond that provided by the object itself. This background representation can act as a frame of reference for coding distance. For multisensory distance input, object, contextual, and background information must be integrated across modalities, and because not all of this information is necessarily available at the same time, memory must be involved in the integration process. The rationale underlying the proposed research is that once a conceptual framework for auditory/visual distance integration has been specified and validated for normal populations, new and innovative approaches can be applied to understanding and minimizing the impact of sensory impairments on spatial perception and navigation. This hypothesis will be tested by pursuing two specific aims: 1) reveal an integrated auditory and visual reference frame for distance perception based on the environmental background, and 2) determine the role of working memory in auditory/visual distance perception.
These aims will be addressed by testing human distance-judgment and navigation performance under conditions in which the contributions of contextual information, background information, or working memory are manipulated. Virtual and real stimulus-manipulation techniques will allow novel pairings of auditory and visual information that will be used to evaluate and refine the proposed framework. Development and validation of this framework will be a significant contribution because it will provide a better understanding of how humans successfully integrate auditory and visual information to perform spatial tasks in the environment. Moreover, it will provide a vehicle for future studies to advance the field of multisensory space perception. The proposed research is relevant to public health because it will lead to a better understanding of how auditory or visual impairment affects multisensory space perception. Ultimately, this knowledge may inform the development of new strategies for assisting or enhancing degraded spatial information to improve orientation and navigation abilities in visually- and/or hearing-impaired populations.
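As an illustrative baseline only (this model and its symbols are not part of the proposal and are introduced here purely for exposition), multisensory integration is often formalized as reliability-weighted averaging: given independent, Gaussian-distributed unisensory distance estimates $d_A$ (auditory) and $d_V$ (visual) with variances $\sigma_A^2$ and $\sigma_V^2$, the maximum-likelihood combined estimate is

\[
\hat{d} = w_A d_A + w_V d_V, \qquad w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \qquad w_V = 1 - w_A,
\]

with combined variance $\sigma_{AV}^2 = \sigma_A^2 \sigma_V^2 / (\sigma_A^2 + \sigma_V^2) \le \min(\sigma_A^2, \sigma_V^2)$. The framework proposed above posits additional contributions from background, contextual, and memory information that such a simple weighting scheme does not capture.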

Public Health Relevance

Space perception in the intermediate distance range (2-20 m) depends on both the auditory and visual systems and plays a crucial role in everyday activities, including navigation. The proposed psychophysical research investigates how auditory and visual spatial information is integrated for distance perception. The research outcomes will advance our scientific understanding of how humans perceive and interact with the world, and will also improve non-invasive methods for assessing malfunctions of space perception in hearing- and/or visually-impaired populations.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Exploratory/Developmental Grants (R21)
Project #
1R21EY023767-01
Application #
8575577
Study Section
Special Emphasis Panel (ZRG1-BBBP-T (52))
Program Officer
Wiggs, Cheri
Project Start
2013-07-01
Project End
2015-06-30
Budget Start
2013-07-01
Budget End
2014-06-30
Support Year
1
Fiscal Year
2013
Total Cost
$285,611
Indirect Cost
$85,611
Name
University of Louisville
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
057588857
City
Louisville
State
KY
Country
United States
Zip Code
40292
Kim, Duck O; Zahorik, Pavel; Carney, Laurel H et al. (2015) Auditory distance coding in rabbit midbrain neurons and human perception: monaural amplitude modulation depth as a cue. J Neurosci 35:5360-72