Sign language is vital to the lives and well-being of many deaf people in the United States who rely on this mode of communication. However, current theories in linguistics, psychology, and cognitive neuroscience have been developed primarily from investigations of spoken language. This focus on speech leaves open many critical questions about linguistic phenomena that are fundamentally shaped by the visual-manual modality. NIDCD recognizes these gaps in our understanding: the Strategic Plan states, "Enhancing our understanding of visual-manual language systems opens a window into general human cognition" (p. 32). This project aims to develop a neurobiological model of sign language perception and comprehension that takes into account essential modality-specific phenomena: phonology without sound, lexical iconicity (non-arbitrary mappings between the form and meaning of signs), and "spatial syntax" (the use of locations in signing space to express grammatical roles and co-reference).
Aim 1 of the project is to map the cortical representation of phonological structure in American Sign Language (ASL) using functional magnetic resonance imaging (fMRI). fMRI adaptation methods will be used to investigate whether body-selective neural regions become tuned to linguistic handshapes and body locations in signers. In a second fMRI study, phonological neighborhood density in ASL will be manipulated to identify neural areas that support lexical-level phonological representations.
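To illustrate the lexical manipulation in the second study, the sketch below computes phonological neighborhood density over a toy lexicon in which each sign is coded as a (handshape, location, movement) triple and two signs count as neighbors if they differ in exactly one parameter. The mini-lexicon, parameter values, and one-parameter neighbor criterion are simplifying assumptions for illustration only; the actual study would draw on a full lexical database such as ASL-LEX (Caselli et al., 2017, cited below).

# Minimal sketch: phonological neighborhood density for signs coded as
# (handshape, location, movement) triples. Toy lexicon; values are stand-ins.

from itertools import combinations

lexicon = {
    "APPLE":  ("X", "cheek",    "twist"),
    "ONION":  ("X", "temple",   "twist"),
    "CANDY":  ("1", "cheek",    "twist"),
    "SUMMER": ("X", "forehead", "pull"),
    "DRY":    ("X", "chin",     "pull"),
}

def are_neighbors(a, b):
    """Two signs are neighbors if they differ in exactly one parameter."""
    return sum(x != y for x, y in zip(a, b)) == 1

density = {sign: 0 for sign in lexicon}
for (s1, f1), (s2, f2) in combinations(lexicon.items(), 2):
    if are_neighbors(f1, f2):
        density[s1] += 1
        density[s2] += 1

for sign, n in sorted(density.items(), key=lambda kv: -kv[1]):
    print(f"{sign}: {n} neighbor(s)")

Under this coding, APPLE has the densest neighborhood (it shares two parameters with both ONION and CANDY), which is the kind of item-level variable the proposed study would manipulate across high- and low-density sign sets.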
Aim 2 of the project is to identify the impact of iconicity on lexical representations and learning. Event-related potentials (ERPs) will be used to assess whether effects of iconicity are task-dependent (e.g., related to the use of picture stimuli) or reflect distinct neural representations for iconic signs (e.g., more robust encoding of sensory-motor semantic features). ERPs will also be used to investigate whether brain responses to iconic signs change with learning and whether learners represent signs as holistic gestures or are sensitive to internal phonological structure.
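A minimal sketch of the kind of ERP measure this aim implies is given below: mean amplitude in an N400 window (300-500 ms) compared across iconic and non-iconic sign conditions. The data are simulated with numpy, and the window, amplitudes, and direction of the effect are assumptions for illustration only; a real analysis would use recorded EEG epochs (e.g., in MNE-Python) with appropriate statistics.

# Simulated single-channel ERP comparison: mean N400-window amplitude
# for iconic vs. non-iconic signs. All effect sizes are illustrative.

import numpy as np

rng = np.random.default_rng(0)
sfreq = 500                                  # sampling rate in Hz
times = np.arange(-0.2, 0.8, 1 / sfreq)      # epoch from -200 to 800 ms
n400_mask = (times >= 0.3) & (times <= 0.5)  # N400 analysis window

def simulate_epochs(n_trials, n400_amp_uv):
    """Simulate epochs with a Gaussian N400-like deflection at 400 ms."""
    component = n400_amp_uv * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))
    noise = rng.normal(0, 2.0, size=(n_trials, times.size))
    return component + noise

iconic = simulate_epochs(40, n400_amp_uv=-4.0)      # assumed condition difference
non_iconic = simulate_epochs(40, n400_amp_uv=-2.0)

# Per-trial mean amplitude in the N400 window, then the condition means.
iconic_mean = iconic[:, n400_mask].mean(axis=1)
non_iconic_mean = non_iconic[:, n400_mask].mean(axis=1)
print(f"iconic: {iconic_mean.mean():.2f} uV, "
      f"non-iconic: {non_iconic_mean.mean():.2f} uV")

The same window-averaging logic, applied before and after a training phase, is how a learning-related change in the response to iconic signs could be quantified.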
Aim 3 of the project is to identify the neural network involved in comprehending "spatial syntax." fMRI will be used to determine whether right parietal cortex is engaged when understanding ASL verbs that are directed toward locations in signing space (R-loci) to indicate grammatical roles. A second fMRI experiment is designed to identify the neural correlates of spatial indexing (the establishment and maintenance of R-loci). Overall, the project aims to enhance our understanding of the neurobiology of visual-manual language, which will provide a translational foundation for treating injury to the language system and for diagnosing language impairments in deaf individuals.
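To make the subtraction logic of the first experiment concrete, here is a minimal GLM sketch contrasting spatially inflected verbs directed at R-loci against plain (uninflected) verbs, for a single simulated voxel. The condition timing, HRF shape, and effect sizes are illustrative assumptions, not the project's actual design; real analyses would use a standard fMRI package (e.g., SPM, FSL, or nilearn).

# One-voxel GLM sketch: contrast of spatially inflected vs. plain verbs.
# Timing, HRF, and betas are toy values for illustration.

import numpy as np

tr, n_scans = 2.0, 120
frame_times = np.arange(n_scans) * tr

def hrf(t):
    """Crude double-gamma-like HRF (illustrative, not a canonical HRF)."""
    return (t ** 5) * np.exp(-t) / 120 - 0.1 * (t ** 8) * np.exp(-t) / 40320

def regressor(onsets, dur=4.0):
    """Boxcar at the given onsets (s) convolved with the HRF."""
    box = np.zeros(n_scans)
    for on in onsets:
        box[(frame_times >= on) & (frame_times < on + dur)] = 1.0
    h = hrf(np.arange(0, 30, tr))
    return np.convolve(box, h)[:n_scans]

X = np.column_stack([
    regressor(np.arange(10, 230, 60)),   # spatially inflected verbs
    regressor(np.arange(40, 230, 60)),   # plain verbs
    np.ones(n_scans),                    # intercept
])

rng = np.random.default_rng(1)
y = X @ np.array([1.5, 0.8, 100.0]) + rng.normal(0, 1, n_scans)  # one voxel

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
contrast = np.array([1.0, -1.0, 0.0])    # spatial > plain
print(f"contrast estimate: {contrast @ beta:.2f}")

A positive contrast estimate in right parietal voxels, tested across subjects, is the pattern that would support the hypothesized role of right parietal cortex in processing R-loci.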

Public Health Relevance

This research will benefit individuals who are deaf and use sign language as their primary language and will support the medical and rehabilitative organizations that serve this population. The findings will advance knowledge of the neural underpinnings of sign language, providing crucial information to clinicians and neurologists who are treating deaf patients with brain injury or disease.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
2R01DC010997-41
Application #
10120531
Study Section
Language and Communication Study Section (LCOM)
Program Officer
Cooper, Judith
Project Start
1979-07-01
Project End
2025-11-30
Budget Start
2020-12-01
Budget End
2021-11-30
Support Year
41
Fiscal Year
2021
Total Cost
Indirect Cost
Name
San Diego State University
Department
Other Health Professions
Type
Sch Allied Health Professions
DUNS #
073371346
City
San Diego
State
CA
Country
United States
Zip Code
92182
Blanco-Elorrieta, Esti; Emmorey, Karen; Pylkkänen, Liina (2018) Language switching decomposed through MEG and evidence from bimodal bilinguals. Proc Natl Acad Sci U S A 115:9708-9713
Mooney, Aimee; Beale, Naomi; Fried-Oken, Melanie (2018) Group Communication Treatment for Individuals with PPA and Their Partners. Semin Speech Lang 39:257-269
Giustolisi, Beatrice; Emmorey, Karen (2018) Visual Statistical Learning With Stimuli Presented Sequentially Across Space and Time in Deaf and Hearing Adults. Cogn Sci 42:3177-3190
Majid, Asifa; Roberts, Seán G; Cilissen, Ludy et al. (2018) Differential coding of perception in the world's languages. Proc Natl Acad Sci U S A 115:11369-11376
Blanco-Elorrieta, Esti; Kastner, Itamar; Emmorey, Karen et al. (2018) Shared neural correlates for building phrases in signed and spoken language. Sci Rep 8:5492
Emmorey, Karen (2018) Variation in late L1 acquisition? Biling (Camb Engl) 21:917-918
Caselli, Naomi K; Sehyr, Zed Sevcikova; Cohen-Goldberg, Ariel M et al. (2017) ASL-LEX: A lexical database of American Sign Language. Behav Res Methods 49:784-801
Emmorey, Karen; Giezen, Marcel R; Petrich, Jennifer A F et al. (2017) The relation between working memory and language comprehension in signers and speakers. Acta Psychol (Amst) 177:69-77
Emmorey, Karen (2016) Consequences of the Now-or-Never bottleneck for signed versus spoken languages. Behav Brain Sci 39:e70
Emmorey, Karen; Mehta, Sonya; McCullough, Stephen et al. (2016) The neural circuits recruited for the production of signs and fingerspelled words. Brain Lang 160:30-41

Showing the most recent 10 out of 31 publications