The ability to focus attention on elements of our sensory environment is a critical cognitive function that enables enhanced processing of high-priority stimuli. A classic auditory example is the "cocktail party effect," in which a person can focus selectively on a particular speaker while tuning out other conversations. In vision, attending to a particular region of the visual field results in faster and better discrimination of stimuli in that region. In recent years, both electrophysiological and functional brain imaging studies have suggested that a network of frontal and parietal brain areas enables selective attention by enhancing responses in sensory cortices in favor of task-relevant stimuli. Almost all studies of attention, however, have been conducted within a single sensory modality. The real world is multisensory: numerous real objects have multisensory characteristics (e.g., both auditory and visual aspects) that need to be attended, perceived, and integrated. With funding from the National Science Foundation, Dr. Marty Woldorff is combining electrical and functional imaging measures of brain activity to study the mechanisms by which attention operates in a multisensory world. This includes the study of (1) whether there are separate attentional resources for processing stimuli in different sensory modalities, (2) how attention influences multisensory integration processes, and (3) how attention may spread from one sensory modality to another within a multisensory object. Recording both electrical and functional imaging measures of brain activity during the performance of multisensory attentional tasks will reveal the location, timing, and sequence of the brain mechanisms underlying multisensory attentional processes.

The broader impacts of this project relate to the fact that the real world is multisensory, yet the mechanisms by which attention operates in multisensory circumstances are far from understood. As just one example, every day millions of motorists drive on streets and highways in the multisensory environment of a car, in which they must cope with a myriad of visual and auditory sensory inputs, including police sirens, car horns, radios, talking passengers, and misbehaving children. Understanding processing limitations, and how attention facilitates performance by influencing multisensory interactions, is of fundamental importance for understanding the performance of such everyday real-life activities. Moreover, understanding these mechanisms could have a large practical impact on matters ranging from the training of drivers to the design of cars and cockpits. The present work also has implications for understanding the mechanisms by which people integrate the auditory content of speech with the visual input of mouth and head movements, a function fundamental to human beings everywhere. Lastly, gaining basic scientific understanding of how attention operates in multisensory circumstances has impact not only for individuals with normal perceptual and attentional capabilities, but also for individuals in whom such capabilities have been impaired through injury or other causes.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 0524031
Program Officer: Lynne Bernstein
Budget Start: 2005-09-01
Budget End: 2010-08-31
Fiscal Year: 2005
Total Cost: $329,896
Name: Duke University
City: Durham
State: NC
Country: United States
Zip Code: 27705