Abstract

Speech perception is one of the most remarkable achievements of the human brain, yet how the brain extracts meaning from a speech signal is still poorly understood. During speech perception, what we hear is influenced by sensory input from our eyes and skin: viewing a speaker's face improves speech intelligibility, and feeling a speaker's lips and jaw alters the perception of heard speech. Remarkably, individuals can be trained to perceive speech through their sense of touch in the absence of auditory input. There is a long history of research on communicating speech through the somatosensory system, dating back to 1924, yet no previous study has investigated the neural mechanisms of speech perception in the somatosensory system.

The goal of this study is to determine how the brain learns to perceive speech through the sense of touch. The knowledge gained will inform current theories of auditory and visual speech perception, as well as cross-modal plasticity. This project will also inform the design of tactile speech prostheses that can serve as alternatives for patients with auditory nerve damage who cannot receive cochlear implants.

In this study, participants will be trained to perceive vibrotactile speech: tactile patterns generated from recordings of spoken syllables. Behavioral experiments combined with advanced electroencephalography (EEG) techniques will be used to test two hypotheses: 1) learning vibrotactile stimuli derived from speech constitutes genuine speech perception, and 2) the superior temporal gyrus, a key region involved in auditory speech perception, acquires selectivity for vibrotactile speech after training. One hallmark of speech perception is the ability to recombine speech units to accurately perceive new stimuli; therefore, participants will be tested on their ability to generalize to untrained speech stimuli. Multisensory integration is another important feature of speech perception: visual and auditory speech are effortlessly integrated in the brain to enhance perception. Enhancement of visual speech perception by simultaneously presented vibrotactile speech would provide further evidence that vibrotactile stimuli are learned as speech. To test the hypothesis that visual and vibrotactile speech are perceptually integrated, participants will be tested on their ability to identify visual-only and visual-vibrotactile speech.

To determine how vibrotactile speech is learned and represented by the brain, multivariate pattern analysis (MVPA) and representational similarity analysis (RSA) will be used. MVPA and RSA are statistical techniques used in cognitive neuroscience to test hypotheses about how physical stimuli are represented in the brain. Pilot data demonstrate the feasibility of decoding auditory speech and vibrotactile stimuli from EEG data using MVPA and RSA. These techniques will be used to test the hypothesis that vibrotactile speech is encoded in the superior temporal gyrus. Overall, this project will provide insight into speech learning in the somatosensory system, addressing the fundamental question of how speech perception is accomplished by the human brain.
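The abstract does not specify how tactile patterns are generated from spoken syllables, but a common approach in the tactile-speech literature is to extract the amplitude envelope of the recording and use it to modulate a fixed-frequency carrier tuned to skin sensitivity. The sketch below illustrates that general idea only; the carrier frequency, filter cutoff, and the synthetic stand-in "syllable" are assumptions, not the study's actual stimulus pipeline.

```python
# Illustrative sketch: deriving a vibrotactile drive signal from speech-like
# audio via envelope extraction (Hilbert transform + low-pass filtering).
# All parameter values here are assumptions for illustration.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 16000                       # audio sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)    # 500 ms stimulus

# Stand-in for a recorded syllable: a 150 Hz "voiced" tone with a
# consonant-vowel-like amplitude contour.
audio = np.sin(2 * np.pi * 150 * t) * np.exp(-((t - 0.25) ** 2) / 0.01)

# Amplitude envelope: magnitude of the analytic signal, low-passed at 50 Hz.
envelope = np.abs(hilbert(audio))
b, a = butter(4, 50 / (fs / 2), btype="low")
envelope = np.clip(filtfilt(b, a, envelope), 0, None)

# Vibrotactile output: envelope-modulated 250 Hz carrier, near the peak
# sensitivity of the skin's Pacinian mechanoreceptors.
carrier = np.sin(2 * np.pi * 250 * t)
vibrotactile = envelope * carrier
print(f"peak actuator drive amplitude: {np.max(np.abs(vibrotactile)):.3f}")
```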
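To make the MVPA and RSA logic concrete, the sketch below shows, on synthetic EEG-like data, how a classifier can decode stimulus identity from multichannel activity patterns and how a neural representational dissimilarity matrix (RDM) can be compared against a model RDM. The array shapes, classifier choice, and the hypothetical phonetic-feature model are assumptions for illustration, not the project's actual analysis pipeline.

```python
# Minimal MVPA + RSA sketch on synthetic EEG-like data
# (n_trials x n_channels x n_times), assuming four syllable conditions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_syllables, trials_per, n_channels, n_times = 4, 40, 64, 100

# Synthetic "EEG": each syllable evokes a distinct spatiotemporal pattern
# plus trial-by-trial noise.
patterns = rng.normal(size=(n_syllables, n_channels, n_times))
X = np.concatenate([p + rng.normal(scale=3.0, size=(trials_per, n_channels, n_times))
                    for p in patterns])
y = np.repeat(np.arange(n_syllables), trials_per)

# MVPA: cross-validated decoding of syllable identity from the channel
# pattern at selected time points.
for t in (25, 50, 75):
    acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5).mean()
    print(f"t={t}: decoding accuracy {acc:.2f} (chance = {1 / n_syllables:.2f})")

# RSA: neural RDM from condition-mean responses (correlation distance) ...
means = np.stack([X[y == k].mean(axis=0).ravel() for k in range(n_syllables)])
neural_rdm = squareform(pdist(means, metric="correlation"))

# ... compared with a hypothetical model RDM built from binary phonetic
# features (e.g., place of articulation, voicing) for the four syllables.
features = np.array([[0, 0], [1, 0], [0, 1], [1, 1]])
model_rdm = squareform(pdist(features, metric="cityblock"))
iu = np.triu_indices(n_syllables, k=1)
rho, _ = spearmanr(neural_rdm[iu], model_rdm[iu])
print(f"neural-model RDM correlation (Spearman rho): {rho:.2f}")
```

In an analysis like this, above-chance decoding of vibrotactile syllables from electrodes over the superior temporal gyrus, and a reliable neural-model RDM correlation, would be the kind of evidence consistent with the hypothesis that vibrotactile speech is encoded in that region.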

Public Health Relevance

Speech perception is one of the most remarkable achievements of the human brain, but how the brain extracts meaning from a speech signal is still poorly understood. This study aims to determine how the brain learns to perceive speech through the sense of touch to provide insight into the neural mechanisms of speech perception. The knowledge gained will also inform the design of tactile speech prostheses that can serve as alternatives for patients with auditory nerve damage who cannot receive cochlear implants.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Individual Predoctoral NRSA for M.D./Ph.D. Fellowships (ADAMHA) (F30)
Project #
5F30DC016496-04
Application #
9966945
Study Section
Special Emphasis Panel (ZDC1)
Program Officer
Rivera-Rentas, Alberto L
Project Start
2017-07-01
Project End
2021-06-30
Budget Start
2020-07-01
Budget End
2021-06-30
Support Year
4
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Georgetown University
Department
Neurosciences
Type
Schools of Medicine
DUNS #
049515844
City
Washington
State
DC
Country
United States
Zip Code
20057