Human communication, from classroom learning to friendly conversation, involves the integration of auditory and visual speech information, and understanding when and how this integration occurs in the brain is a fundamental question in communication science. Moreover, virtually nothing is known about how visual information affects auditory processing of music. With support from the National Science Foundation, Dr. Nina Kraus will conduct three years of research investigating how visual information modulates auditory inputs to the brain.

Of primary interest is the fact that seeing a talking face greatly improves a listener's ability to understand speech, yet when and where in the brain the auditory and visual speech information is integrated remains poorly understood. Preliminary findings from the Kraus laboratory suggest that this integration occurs earlier in the brain's neural processing network than previously thought, challenging prevailing views of sensory integration.

These questions will be addressed by recording brain responses (EEG) from normal adult listeners at different levels of the central nervous system. Two experiments will be conducted. The first will address visual influences on auditory processing of speech by comparing brain responses to auditory-only speech with responses elicited by audiovisual speech, provided by a talking face. The second will investigate visual influences on music perception by comparing brain responses elicited by a cello note alone with responses elicited by a cello note accompanied by a video of a musician bowing a cello.
To assess the effect of experience on audiovisual interaction, the latter study will employ both non-musicians and professional musicians as subjects: professionally trained musicians may use visual musical information, such as viewing the bowing of a cello, to enhance their perception of musical sounds, much as seeing a talking face enhances the perception of speech. This study will also investigate the extent to which speech and music are processed by different networks in the brain, providing important information about how the brain is programmed to process different types of acoustic signals.

This work could have broad impacts. It will provide a better understanding of the sensory processes necessary for successful human communication, which may inform improved communication techniques and practices for schoolteachers and public speakers, as well as for those learning foreign languages or learning to play musical instruments. Furthermore, an understanding of how the normal auditory and visual systems interact in the brain will provide an essential baseline for future studies of audiovisual interactions in disabled populations, such as hearing- and reading-impaired individuals, and could lead to improvements in the identification and remediation of these disabilities.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 0544846
Program Officer: Lynne Bernstein
Project Start:
Project End:
Budget Start: 2006-06-01
Budget End: 2010-05-31
Support Year:
Fiscal Year: 2005
Total Cost: $536,067
Indirect Cost:
Name: Northwestern University at Chicago
Department:
Type:
DUNS #:
City: Evanston
State: IL
Country: United States
Zip Code: 60201