Speech perception is one of the most important cognitive functions of the brain. Humans use both the auditory information in heard speech and the visual information available from viewing the speaker's face to understand speech. Functional magnetic resonance imaging (fMRI) is the most popular method for examining human brain function, but it suffers from a critical limitation: because it measures hemodynamic changes, its temporal resolution is on the order of seconds rather than the millisecond timescale of neuronal activity, making it a poor choice for examining dynamic neuronal processing. Instead, we propose to use electrocorticography (ECoG), in which electrodes implanted in human subjects for the treatment of epilepsy are used to measure neuronal activity in the human brain directly. The major innovation of this proposal is to use ECoG to measure the dynamic neural processes underlying audiovisual speech perception.

The central hypothesis is that rapid communication occurs between the human superior temporal sulcus (STS) and sensory areas through neural oscillations, specifically frequency-specific interactions between multisensory cortex in the STS and earlier sensory areas in auditory and visual cortex. As a measure of functional connectivity between areas in the speech network, we will measure trial-by-trial correlations in gamma-band power between brain areas, under the assumption that high correlations in gamma-band power reflect information transfer and a functional connection between the areas. Neuronal oscillations, particularly in the gamma range (~30-200 Hz), have been found to reflect neuronal spiking activity, and several studies suggest that they may be an important mechanism of information transfer in the brain that modulates spiking activity. We will test the hypothesis that gamma-band activity in the STS correlates with gamma-band activity in auditory association areas during multisensory speech perception.

In ordinary speech, the auditory speech signal is much more informative than the visual speech signal. We therefore hypothesize that for clear audiovisual speech, the correlation in gamma oscillations between auditory cortex and the STS is stronger than the correlation between visual cortex and the STS; conversely, we expect a stronger correlation between gamma-band activity in visual cortex and the STS when the auditory speech signal is noisy. The proposed research will use direct neural recordings from the human brain to investigate whether the information content of the different speech signals affects communication between early sensory areas and multisensory association areas. These findings will help us understand the neuronal dynamics of speech perception and devise new therapies for veterans with hearing loss and other disabilities.
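The trial-by-trial correlation of gamma-band power between regions described above can be illustrated with a short Python example. This is a minimal sketch only, assuming ECoG trials stored as NumPy arrays of shape (n_trials, n_samples) sampled at fs Hz; the function names and array layout are hypothetical placeholders and do not describe the project's actual analysis pipeline.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import pearsonr

def gamma_power_per_trial(trials, fs, band=(30.0, 200.0)):
    # Band-pass each trial into the gamma range, then take the mean
    # squared Hilbert envelope as a single power value per trial.
    # (Assumes fs is well above 400 Hz so the 200 Hz edge is below Nyquist.)
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    filtered = filtfilt(b, a, trials, axis=-1)
    power = np.abs(hilbert(filtered, axis=-1)) ** 2
    return power.mean(axis=-1)

def gamma_power_correlation(region_a_trials, region_b_trials, fs):
    # Trial-by-trial Pearson correlation of gamma power between two regions,
    # e.g. STS vs. auditory cortex; run separately for clear and noisy speech.
    power_a = gamma_power_per_trial(region_a_trials, fs)
    power_b = gamma_power_per_trial(region_b_trials, fs)
    return pearsonr(power_a, power_b)

Under the proposal's hypothesis, this correlation would be higher for the STS-auditory pair than for the STS-visual pair when the auditory speech is clear, and the pattern would reverse when the auditory signal is degraded by noise.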

Public Health Relevance

Many U.S. veterans with acquired hearing loss suffer from a disabling loss in the ability to process and understand speech. Multisensory integration, using the visual information in the talker's face, can help compensate for these auditory deficits. However, little is known about the neural mechanisms of multisensory speech perception. We propose to remedy this knowledge gap using electrocorticography, which is ideally suited to investigating the dynamics of speech perception at high temporal (millisecond) and spatial (millimeter) resolution. A better understanding of the neural mechanisms of multisensory speech perception will allow us to design therapies that promote multisensory integration in patients with sensory dysfunction, especially hearing loss, and in patients with language impairments from other causes, such as stroke.

Agency: National Institutes of Health (NIH)
Institute: Veterans Affairs (VA)
Type: Non-HHS Research Projects (I01)
Project #: 5I01CX001122-03
Application #: 9337251
Study Section: Neurobiology B (NURB)
Project Start: 2015-04-01
Project End: 2018-03-31
Budget Start: 2017-04-01
Budget End: 2018-03-31
Support Year: 3
Fiscal Year: 2017
Total Cost:
Indirect Cost:
Name: Michael E Debakey VA Medical Center
Department:
Type:
DUNS #: 078446044
City: Houston
State: TX
Country: United States
Zip Code: 77030
Magnotti, John F; Beauchamp, Michael S (2018) Published estimates of group differences in multisensory integration are inflated. PLoS One 13:e0202908
Ozker, Muge; Yoshor, Daniel; Beauchamp, Michael S (2018) Converging Evidence From Electrocorticography and BOLD fMRI for a Sharp Functional Boundary in Superior Temporal Gyrus Related to Multisensory Speech Processing. Front Hum Neurosci 12:141
Ozker, Muge; Yoshor, Daniel; Beauchamp, Michael S (2018) Frontal cortex selects representations of the talker's mouth to aid in speech perception. Elife 7:
Ozker, Muge; Schepers, Inga M; Magnotti, John F et al. (2017) A Double Dissociation between Anterior and Posterior Superior Temporal Gyrus for Processing Audiovisual Speech Demonstrated by Electrocorticography. J Cogn Neurosci 29:1044-1060
Bosking, William H; Beauchamp, Michael S; Yoshor, Daniel (2017) Electrical Stimulation of Visual Cortex: Relevance for the Development of Visual Cortical Prosthetics. Annu Rev Vis Sci 3:141-166
Schepers, Inga M; Yoshor, Daniel; Beauchamp, Michael S (2015) Electrocorticography Reveals Enhanced Visual Cortex Responses to Visual Speech. Cereb Cortex 25:4103-10