The integration of facial gestures and vocal signals is an essential process in communication and involves many brain regions. One brain region shown to play a crucial role in the processing of face and vocal information is the ventral frontal lobe, home to our language-processing areas. While previous studies have described the role of the frontal lobes in working memory, decision-making, and goal-directed behavior with regard to visual information, few have examined the cellular basis of auditory and integrative processes in the primate prefrontal cortex. The goal of the experiments in this proposal is to understand how neurons in the ventral prefrontal cortex process and integrate complex auditory and visual information, especially the face and vocalization information that is relevant to social communication. We have previously shown that single neurons that respond to faces, to vocalizations, and to their integration are found in the ventrolateral prefrontal cortex (VLPFC), a region homologous to language regions in the human inferior frontal gyrus. However, vocalization-responsive neurons that also respond to faces are found mainly in anterior VLPFC. This suggests that portions of VLPFC may be specialized to process and integrate social stimuli, such as faces and vocalizations, or speech and gestures in the human brain. We will assess neuronal activity in primate prefrontal cortex during audiovisual memory tasks to determine: 1) Is vocalization processing a multisensory process? 2) What factors contribute to multisensory processing and integration of social versus non-social stimuli? 3) Is the ventral prefrontal cortex necessary for remembering a face-vocalization stimulus?
If we can understand the neural processes that normally occur when a facial gesture is integrated with, or enhanced by, vocal information, we may begin to understand the problems that arise when this process is disrupted, as it may be in language disorders and autism spectrum disorders.
The integration of face and voice information, as well as other audiovisual information, is a basic part of many cognitive functions, including communication and recognition. Audiovisual integration relies on several brain regions, including language regions in the ventral frontal lobe. In this proposal we will investigate how neurons in the frontal lobe integrate audiovisual information, both social and non-social.
Plakke, B; Romanski, L M (2016) Neural circuits in auditory and audiovisual memory. Brain Res 1640:278-88
Hwang, Jaewon; Romanski, Lizabeth M (2015) Prefrontal neuronal responses during audiovisual mnemonic processing. J Neurosci 35:960-71
Plakke, Bethany; Hwang, Jaewon; Romanski, Lizabeth M (2015) Inactivation of Primate Prefrontal Cortex Impairs Auditory and Audiovisual Working Memory. J Neurosci 35:9666-75
Diehl, Maria M; Romanski, Lizabeth M (2014) Responses of prefrontal multisensory neurons to mismatching faces and vocalizations. J Neurosci 34:11233-43
Plakke, Bethany; Romanski, Lizabeth M (2014) Auditory connections and functions of prefrontal cortex. Front Neurosci 8:199
Plakke, Bethany; Diltz, Mark D; Romanski, Lizabeth M (2013) Coding of vocalizations by single neurons in ventrolateral prefrontal cortex. Hear Res 305:135-43
Romanski, Lizabeth M (2012) Integration of faces and vocalizations in ventral prefrontal cortex: implications for the evolution of audiovisual speech. Proc Natl Acad Sci U S A 109 Suppl 1:10717-24
Romanski, L M; Hwang, J (2012) Timing of audiovisual inputs to the prefrontal cortex and multisensory integration. Neuroscience 214:36-48
Romanski, L M; Diehl, M M (2011) Neurons responsive to face-view in the primate ventrolateral prefrontal cortex. Neuroscience 189:223-35
Romanski, Lizabeth M; Averbeck, Bruno B (2009) The primate cortical auditory system and neural representation of conspecific vocalizations. Annu Rev Neurosci 32:315-46
Showing the most recent 10 out of 15 publications