The long-term goal of this research is to understand the neural representations that guide speech production and perception. While speech and language are central features of human behavior, the neural mechanisms of speech processes remain poorly understood. We will build on recent advances in the neural control of visually-guided reaching to develop analogous models for speech processing.
Aim 1 will employ a pseudo-word repetition paradigm to test the hypothesis that speech production is encoded in neural activity using an auditory-articulatory representation.
Aim 2 will employ a pseudo-word identification paradigm to test the hypothesis that speech perception is encoded using an auditory-articulatory representation.
Aim 3 will attempt to replicate the findings of Aims 1 and 2 using a complementary set of pseudo-words. Neural activity will be recorded directly with intracranial electrodes in human patients with pharmacologically intractable epilepsy. I predict that neural activity in specific cortical areas supporting speech and language will exhibit invariances for particular auditory-articulatory mappings. Such invariances would constitute evidence that speech is processed jointly in auditory and articulatory space, and could explain the specific role of auditory signals in speech production and of articulatory signals in speech perception.
Augmentative and alternative communication systems give people with severe communication disorders the ability to communicate. Even after a severe loss of motor function, people can continue to express themselves with language if they retain a communication channel. The long-term goal of this research is to give people with severe motor disorders the ability to speak and communicate by decoding the neural representations that support speech and language processing.