PROJECT 4
A major issue in hearing loss is variability: hearing-impaired (HI) listeners with similar profiles often show different outcomes. Correlational studies show that signal quality (audibility, frequency separation) is related to outcomes; however, factors like device experience, cognition, and brain function are equally important. It is unclear how such adaptations, cognitive resources, or brain areas improve perception. This project tackles this variability by leveraging mechanisms and measures from cognitive science that describe how sound is mapped to meaning, focusing on the issue of time. Because speech unfolds over time, there are ambiguous periods when the input is compatible with many words; for example, at the onset of butter, the signal could match bump, but, and buck. Normal-hearing (NH) listeners manage this ambiguity by immediately activating multiple words that compete dynamically over time. For HI listeners, this natural ambiguity may be more problematic and may be managed differently. We assess the dynamics of word recognition with an eye-tracking paradigm that traces how this competition unfolds over several hundred milliseconds. Prior work suggests that cochlear implant (CI) users tune these dynamics differently than NH listeners; these differences are correlated with outcomes and may help listeners cope with poor input. This project asks why these competition processes differ in HI listeners. Are such differences simply a poorer version of typical language processing imposed by degraded input? Or are they a compensatory adaptation for coping with uncertainty? To answer this question in a way that translates to the real world, Aim 1 moves beyond isolated words to examine sentences, where factors like semantics constrain this competition.
Aim 2 uses a longitudinal study to link differences in competition to peripheral auditory function (Project 2), listening effort (Project 1), and cortical processing (Project 3); and Aim 3 complements this with laboratory studies of adaptation.
Aim 4 examines how HI listeners fuse information from different types of input, for example, aided acoustic hearing combined with a CI.
All aims leverage natural variation across multiple types of HI listeners (users of standard CIs, combined acoustic+electric CI configurations, and hearing aids) to investigate how differences in peripheral input impact the mechanisms of language processing.
PROJECT 4
We seek a better understanding of the cognitive mechanisms by which hearing-impaired listeners (including cochlear implant and hearing aid users) map the auditory signal to meaning, and of how these mechanisms adapt to compensate for poor input. This work will improve cochlear implantation criteria, outcome measures, post-remediation therapies, and signal processing strategies in cochlear implants and hearing aids.