This research addresses the neural bases of speech perception in noisy environments. With its singular role in communication, speech is perhaps the most important everyday stimulus for human beings. Yet rarely does speech occur under pristine conditions; competing voices, reverberation, and other environmental sounds typically corrupt the signal. This poses a continual challenge for listeners with normal hearing and especially for those with hearing loss. Among the 30 million Americans with hearing loss, many suffer depression and social isolation because of their difficulty communicating. In the half-century since the original formulation of the "cocktail party effect", scientists have established three key perceptual/cognitive factors that improve speech intelligibility against a competing background: acoustic cues, audiovisual integration (voice plus mouth movements), and linguistic context. However, little is known about how these mechanisms are implemented in the brain, particularly at the level of large-scale functional neural networks. The proposed research uses functional magnetic resonance imaging (fMRI) integrated with psychophysics to address the three factors that determine intelligibility. Innovative neural network analyses test how interactions among brain regions accommodate degraded speech and improve comprehension.
Our specific AIMS are to identify the neural networks mediating speech perception in noise when intelligibility depends on: 1) Acoustic Cues, 2) Audiovisual Integration, and 3) Linguistic Context. This research program comprises a multipronged and highly cohesive body of work that will help anchor our understanding of speech perception in its neurobiological foundations. Relevance to public health: We study how our brains understand speech in a noisy background, such as at a restaurant, ballgame, or office. Research like this may someday help in designing better hearing aids and similar devices. It may also lead to more effective listening strategies, both for those with healthy hearing and especially for those with hearing loss.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
3R01DC008171-04S1
Application #
7850247
Study Section
Cognitive Neuroscience Study Section (COG)
Program Officer
Miller, Roger
Project Start
2009-07-17
Project End
2011-03-31
Budget Start
2009-07-17
Budget End
2011-03-31
Support Year
4
Fiscal Year
2009
Total Cost
$164,960
Indirect Cost
Name
University of California Davis
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
047120084
City
Davis
State
CA
Country
United States
Zip Code
95618
Bishop, Christopher W; Yadav, Deepak; London, Sam et al. (2014) The effects of preceding lead-alone and lag-alone click trains on the buildup of echo suppression. J Acoust Soc Am 136:803-17
Da Costa, Sandra; van der Zwaag, Wietske; Miller, Lee M et al. (2013) Tuning in to sound: frequency-selective attentional filter in human primary auditory cortex. J Neurosci 33:1858-63
Miller, Lee M (2013) Shaken, not stirred: emergence of neural selectivity in a "cocktail party". Neuron 77:806-9
Shahin, Antoine J; Kerlin, Jess R; Bhat, Jyoti et al. (2012) Neural restoration of degraded audiovisual speech. Neuroimage 60:530-8
London, Sam; Bishop, Christopher W; Miller, Lee M (2012) Spatial attention modulates the precedence effect. J Exp Psychol Hum Percept Perform 38:1371-9
Campbell, Tom; Kerlin, Jess R; Bishop, Christopher W et al. (2012) Methods to eliminate stimulus transduction artifact from insert earphones during electroencephalography. Ear Hear 33:144-50
Bishop, Christopher W; London, Sam; Miller, Lee M (2012) Neural time course of visually enhanced echo suppression. J Neurophysiol 108:1869-83
Hill, Kevin T; Bishop, Christopher W; Miller, Lee M (2012) Auditory grouping mechanisms reflect a sound's relative position in a sequence. Front Hum Neurosci 6:158
Hill, Kevin T; Bishop, Christopher W; Yadav, Deepak et al. (2011) Pattern of BOLD signal in auditory cortex relates acoustic response to perceptual streaming. BMC Neurosci 12:85
Bishop, Christopher W; London, Sam; Miller, Lee M (2011) Visual influences on echo suppression. Curr Biol 21:221-5

Showing the most recent 10 out of 21 publications