People with hearing loss might benefit greatly if they could use the visual signal to recognize speech. Unfortunately, as people age, they often experience a decline in both their hearing ability and their lipreading ability, so that at the very time their audiovisual speech perception could benefit most from visual speech information, they are least able to use it. Understanding the linkage between these phenomena has been a longstanding challenge for both cognitive science and aural rehabilitation, yet we believe this linkage provides a major clue to understanding both individual and age-related differences in everyday speech perception. Most researchers assume that there is a distinct integration stage and that integrative ability diminishes with age. Our most recent findings, however, suggest that differences in unimodal performance, not differences in integrative ability, underlie both individual and age-related differences in audiovisual speech perception. Accordingly, the proposed research takes the unique approach of focusing both on the causes of individual differences in lipreading skill and on lipreading as a cause of differences in audiovisual speech perception.
For Specific Aim 1, we will use our new feature analysis methods to predict audiovisual speech perception based on participants' unimodal (i.e., auditory-only and vision-only) performance. We will evaluate the relative contributions of cognitive, perceptual, and speech production abilities as well as gaze behavior to both individual and age-related differences in vision-only and audiovisual speech perception.
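The feature analysis methods themselves are not detailed in this summary. As one illustration of how unimodal scores can yield an audiovisual prediction, the sketch below implements a classic independent-channels (probability-summation) baseline; the function name and sample accuracies are hypothetical, and this baseline is a common reference model in the audiovisual speech literature, not the proposal's feature analysis method.

```python
import numpy as np

def predict_av_independent(auditory_acc, visual_acc):
    """Probability-summation baseline: predicted audiovisual accuracy
    if the two modalities acted as independent channels. This is a
    common reference model, not the proposal's feature analysis method."""
    a = np.asarray(auditory_acc, dtype=float)
    v = np.asarray(visual_acc, dtype=float)
    # A trial succeeds unless both channels fail independently.
    return 1.0 - (1.0 - a) * (1.0 - v)

# Hypothetical per-participant unimodal proportion-correct scores
auditory = [0.45, 0.60, 0.30]
visual = [0.20, 0.35, 0.50]
print(predict_av_independent(auditory, visual))  # approx. [0.56 0.74 0.65]
```

Observed audiovisual scores above or below such a baseline are what motivate the question of whether a distinct integration stage is needed, or whether unimodal performance alone suffices.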
For Specific Aim 2, we will use a modification of our speech detection and lipread-yourself tasks, along with fMRI, to assess differences in the correspondence between the phonetic representations that support speech production and perception. Finally, for Aim 3, we will identify neural correlates of vision-only and audiovisual speech perception using fMRI. The results from Aims 1 and 2 will help us identify the regions of interest. We will focus on premotor areas because we hypothesize that the linkage between speech production and visual speech perception is critical for understanding how people recognize visual speech signals, and we will directly compare the activation in premotor cortex during speech production and perception using representational similarity analysis, a multivariate analytic technique. By the end of the grant cycle, our goal is to have developed a unified neurocognitive framework for understanding both individual and age-related differences in lipreading and audiovisual speech perception. This framework and our experimental results will also provide evidence-based guidance for both speech perception training and aural rehabilitation counseling.
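Representational similarity analysis sidesteps voxel-level alignment between conditions by comparing the geometry of condition-by-condition dissimilarity matrices. The minimal sketch below shows the core computation on simulated data; the syllable count, voxel count, and ROI patterns are hypothetical stand-ins, not the proposal's actual design or data.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix in condensed form:
    1 - Pearson correlation between every pair of condition patterns.
    `patterns` is a (conditions x voxels) array from one ROI."""
    return pdist(patterns, metric="correlation")

# Hypothetical premotor-ROI patterns: 10 syllables x 200 voxels,
# one pattern estimate per condition from a production run and
# a perception run of the same participant.
rng = np.random.default_rng(0)
produce = rng.standard_normal((10, 200))
perceive = produce + rng.standard_normal((10, 200))  # shared structure + noise

# Spearman correlation between the two RDMs quantifies how similar the
# representational geometries of production and perception are.
rho, p = spearmanr(rdm(produce), rdm(perceive))
print(f"production-perception RDM correlation: rho={rho:.2f}, p={p:.3g}")
```

Because only the rank structure of pairwise dissimilarities is compared, this approach can relate activation patterns across tasks whose raw responses are not directly commensurable.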

Public Health Relevance

When auditory speech signals are degraded by hearing loss or background noise, people can benefit greatly from the addition of the corresponding visual speech signals. The proposed research addresses foundational issues in audiovisual speech perception, focusing on the roles played by individual and age-related differences in lipreading ability. Using both behavioral measures and functional MRI, we will systematically evaluate the contributions of phonetic and lexical processing, peripheral changes (e.g., changes in hearing ability and contrast sensitivity), gaze patterns, and cognitive skills (e.g., verbal and visuospatial processing speed), and we will examine the distinctiveness of neural representations of both speech production and perception, in order to develop a unified neurocognitive account of the audiovisual speech advantage.

Agency: National Institutes of Health (NIH)
Institute: National Institute on Deafness and Other Communication Disorders (NIDCD)
Type: Research Project (R01)
Project #: 5R01DC016594-02
Application #: 9723073
Study Section: Language and Communication Study Section (LCOM)
Program Officer: King, Kelly Anne
Project Start: 2018-07-01
Project End: 2023-06-30
Budget Start: 2019-07-01
Budget End: 2020-06-30
Support Year: 2
Fiscal Year: 2019
Total Cost:
Indirect Cost:
Name: Washington University
Department: Otolaryngology
Type: Schools of Medicine
DUNS #: 068552207
City: Saint Louis
State: MO
Country: United States
Zip Code: 63130