The integration of auditory and visual stimuli is crucial for recognizing objects by sight and sound, communicating effectively, and navigating our complex world. While auditory and visual information are combined at many sites in the human brain, the frontal lobes have been identified as a region associated with memory and language, functions which depend on multisensory integration of complex auditory and visual stimuli. Previous studies in non-human primates have revealed juxtaposed and overlapping areas of visual and auditory processing in the ventral prefrontal cortex, and recently we have observed neurons that respond to combined face and vocal stimuli, indicating a possible role in multimodal sensory integration. The goal of this project is to obtain a fundamental understanding of how the ventral prefrontal cortex processes complex auditory, visual and combined audio-visual stimuli that serve meaningful communication and object recognition. Our experiments will focus on neurophysiological and anatomical analysis of the primate ventral prefrontal cortex.
In Aim 1 we will characterize the selectivity, specificity and organization of prefrontal single-unit electrophysiological responses to face, vocalization and combined face-vocalization stimuli. Our studies will utilize both static (a picture of a face) and dynamic (a short movie) natural stimuli combined with vocalizations.
In Aim 2 we will examine the cellular mechanisms that underlie sensory integration in a cross-modal memory task and the cellular changes that take place during the learning of audio-visual associations in this task. Finally, we will elucidate the neuronal circuitry underlying auditory, visual and multimodal responses in the frontal lobes by determining the afferent and efferent connections of auditory, visual and multimodal regions of the ventral prefrontal cortex. Because our studies are aimed at determining the prefrontal neuronal mechanisms underlying the perception of complex communication stimuli and their integration, our findings will have implications for understanding neurological disorders that affect communication, language and sensory integration, including schizophrenia and autism, in which disturbance of prefrontal cortical function has been described.
Plakke B, Romanski LM (2016) Neural circuits in auditory and audiovisual memory. Brain Res 1640:278-88
Hwang J, Romanski LM (2015) Prefrontal neuronal responses during audiovisual mnemonic processing. J Neurosci 35:960-71
Plakke B, Hwang J, Romanski LM (2015) Inactivation of Primate Prefrontal Cortex Impairs Auditory and Audiovisual Working Memory. J Neurosci 35:9666-75
Diehl MM, Romanski LM (2014) Responses of prefrontal multisensory neurons to mismatching faces and vocalizations. J Neurosci 34:11233-43
Plakke B, Romanski LM (2014) Auditory connections and functions of prefrontal cortex. Front Neurosci 8:199
Plakke B, Diltz MD, Romanski LM (2013) Coding of vocalizations by single neurons in ventrolateral prefrontal cortex. Hear Res 305:135-43
Romanski LM (2012) Integration of faces and vocalizations in ventral prefrontal cortex: implications for the evolution of audiovisual speech. Proc Natl Acad Sci U S A 109 Suppl 1:10717-24
Romanski LM, Hwang J (2012) Timing of audiovisual inputs to the prefrontal cortex and multisensory integration. Neuroscience 214:36-48
Romanski LM, Diehl MM (2011) Neurons responsive to face-view in the primate ventrolateral prefrontal cortex. Neuroscience 189:223-35
Romanski LM, Averbeck BB (2009) The primate cortical auditory system and neural representation of conspecific vocalizations. Annu Rev Neurosci 32:315-46
Showing the most recent 10 out of 15 publications