Human beings confront pervasive ambiguity in language as they listen and read. For example, two spoken words can sound nearly identical (e.g., “beach” and “peach”); sentences can be grammatically analyzed in multiple ways; and speech is often obscured by environmental sounds, as in a crowded restaurant. Although ambiguity routinely increases the risk of arriving at the wrong interpretation, or even no interpretation of a message, healthy young adults are marvelously adept at understanding language, vastly surpassing the capabilities of automatic systems despite major advances in machine learning. Explaining how humans understand language in the face of ambiguity, and why some populations fail, is a fundamental challenge for cognitive science, and is crucial for assisting people with language impairments that intensify under conditions of ambiguity, including stroke patients and healthy older adults with hearing loss. This research will use advanced neurophysiological and eye-tracking methods to illuminate how people regulate their interpretations of speech and language input by using executive functions (EF): the family of cognitive mechanisms that enable information processing in the brain and the flexible guidance of goal-directed behavior. EF are thought to be critical to helping people select among different interpretations of a sentence when more than one is possible, and for guiding the mind toward perceiving a word when the environment is “noisy.” However, the neural and cognitive mechanisms that allow EF to contribute to language comprehension are poorly understood. This project will elucidate EF’s role in language comprehension.
This project will test the hypothesis that two particular forms of executive function – attention control and cognitive control – play fundamental and distinct roles in supporting language comprehension in the face of ambiguity. Attention control is hypothesized to regulate the collection of information by directing perceptual processing toward the intended signal, isolating it from noise under suboptimal listening conditions. Cognitive control is hypothesized to assist the revision of misinterpretations by guiding internal representations to align with relevant sources of evidence when multiple cues compete. The investigators will use neural oscillatory activity in scalp-recorded EEG to provide novel markers of cognitive control and attention control during cognitive task performance. Eye-movement patterns will be used to track participants’ interpretations of language input in real time, by recording the millisecond-level dynamics of listeners’ gaze as they hear sentences and inspect a visual scene before them. The combination of EEG and eye tracking will provide a multi-modal characterization of the neural processes and temporal dynamics of interpretation during language comprehension. To illuminate how attention control and cognitive control separately affect humans’ ability to deal with ambiguity, the experiments will manipulate the engagement of these EF subtypes just prior to language processing, by having participants perform tasks, such as the Flanker task, that are known to recruit cognitive-control processes. This experimental approach will establish causal relationships between EF and language processing and will identify the distinctive contributions of cognitive control and attention control to comprehension.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.