In the natural environment, the brain is often confronted with the daunting task of interpreting auditory signals that occur in the presence of noise, which can render important auditory events ambiguous and less salient. In naturalistic circumstances, however, these auditory cues are typically accompanied by visual information, often arising from the same events. The presence of such coincident audiovisual cues can greatly amplify the salience of a stimulus of interest. Although a number of studies have illustrated the behavioral and perceptual benefits of having multisensory (e.g., audiovisual) cues available, and a growing literature characterizes the neural encoding processes that underlie multisensory interactions, very few studies have been able to link multisensory neural changes to their behavioral and perceptual correlates. To more firmly establish these links, we will train rhesus monkeys to detect or localize a target sound (signal) and to ignore ongoing or simultaneously occurring non-target sounds (noise). The spatial and temporal relationships between the signal and noise will be varied to evaluate their effects on how well the monkey detects or localizes the signal. Similar experiments will be performed with the addition of visual stimuli, whose location and/or timing will be varied such that they sometimes match the signal and sometimes the noise. This will allow us to assess the effects of non-auditory (visual) stimuli on auditory behavioral performance and to evaluate brain mechanisms of multisensory integration. Neurophysiological recordings of neurons in the inferior and superior colliculi are expected to reveal their differential roles in auditory and audiovisual detection and localization behaviors. This work has high translational and clinical relevance for the hearing impaired and the elderly, in that auditory assistive devices often perform poorly in noisy environments.
Greater knowledge of how the brain processes auditory signals within noise, and of how visual information can enhance neural and behavioral performance in complex environments, will be of great utility for the design of better technologies to address hearing loss and its profound impact on quality of life.
The proposed research seeks to better characterize auditory and audiovisual detection and localization behaviors and their neural correlates in non-human primates. The work will fill an important void in our understanding of how midbrain structures (the inferior and superior colliculi [IC and SC]) encode auditory and audiovisual cues in the context of active behaviors, of the differential contributions of the IC and SC to these behaviors, of the gain afforded under multisensory (i.e., audiovisual) conditions, and of how these processes change in noisy, natural environments where distractors are present. The data obtained will have important implications for deafness and hearing loss, and for clinical conditions (e.g., autism, schizophrenia) in which auditory and multisensory function is compromised.