American Sign Language (ASL) displays complex linguistic structure but, unlike spoken languages, conveys much of that structure by manipulating spatial relations, thus exhibiting properties for which each of the cerebral hemispheres of hearing people shows a different predominant specialization. The study of deaf signers with left or right hemisphere damage offers a particularly revealing vantage point for understanding the organization of higher cognitive functions in the brain, and how modifiable that organization may be. We propose three major series of experimental studies, each bringing a special property of the visual-gestural modality to bear on the investigation of brain organization for language: 1) Brain Organization for a Visuospatial Language. We explore in depth the nature of sign language impairments due to brain damage. We investigate the relative contributions of the cerebral hemispheres to language, with special reference to linguistic functions and the spatial mechanisms that convey them. We focus on three levels of structure unique to a language in a different mode: processing 'phonology' without sound, vertically arrayed morphology, and spatially organized syntax. 2) Brain Organization for Visuospatial Functions and Facial Signals. We examine brain organization for nonlinguistic visuospatial processing in deaf signers, evaluating visuoperceptive, visuoconstructive, and visuospatial nonlanguage functions. Additionally, we investigate the breakdown of facial signals that serve two very different functions in ASL: affective versus specifically linguistic. 3) Neural Mechanisms for Sign Language. We investigate the dissociability of apraxia from sign aphasia through experiments that separately evaluate impairments in linguistic, symbolic, and motor functions, and we analyze the neural correlates of sign language. This research has broad implications for the theoretical understanding of the neural mechanisms underlying the human capacity for language.
Patterns of breakdown of a visual-spatial language in deaf signers allow new perspectives on the nature of cerebral specialization for language, since in sign language there is an interplay between visual-spatial and linguistic relations within one and the same system.

National Institutes of Health (NIH)
National Institute of Neurological Disorders and Stroke (NINDS)
Research Project (R01)
Project #
Application #
Study Section
Communication Sciences and Disorders (CMS)
Project Start
Project End
Budget Start
Budget End
Support Year
Fiscal Year
Total Cost
Indirect Cost
Salk Institute for Biological Studies
La Jolla
United States
Zip Code
Galvan, D (1999) Differences in the use of American Sign Language morphology by deaf children: implications for parents and teachers. Am Ann Deaf 144:320-4
Poizner, H; Bellugi, U; Klima, E S (1990) Biological foundations of language: clues from sign language. Annu Rev Neurosci 13:283-307
Bihrle, A M; Bellugi, U; Delis, D et al. (1989) Seeing either the forest or the trees: dissociation in visuospatial processing. Brain Cogn 11:37-49
Bellugi, U; Poizner, H; Klima, E S (1989) Language, modality and the brain. Trends Neurosci 12:380-8
Corina, D P (1989) Recognition of affective and noncanonical linguistic facial expressions in hearing and deaf subjects. Brain Cogn 9:227-37
Vaid, J; Corina, D (1989) Visual field asymmetries in numerical size comparisons of digits, words, and signs. Brain Lang 36:117-26
Vaid, J; Bellugi, U; Poizner, H (1989) Hand dominance for signing: clues to brain lateralization of language. Neuropsychologia 27:949-60
Bellugi, U; Klima, E S; Poizner, H (1988) Sign language and the brain. Res Publ Assoc Res Nerv Ment Dis 66:39-56