Language learning is a robust process that relies on a child's capacity to learn from the sights, sounds, and experiences in the environment. However, for the roughly 200,000 young children in the US with a visual or hearing disability, language learning proceeds differently, and basic questions about the roles of hearing and seeing in language development remain unanswered. Answering these questions is critical for supporting individuals with hearing or vision impairments and those with language delays or deficits, and for improving the health and educational outcomes of all children. Children who are deaf or hard of hearing and do not receive sign language input or assistive technology (e.g., cochlear implants) show large delays in acquiring language skills. When such children learn sign language through the intact visual modality, however, they appear to develop normal language abilities, suggesting that vision can provide a "path" to language. At the same time, other evidence indicates that vision is not strictly necessary for learners with normal hearing: blind adults have largely normal language, and they learn color and vision-related words (e.g., "see") similarly to sighted peers. Neurologically, their "language" brain areas function much as they do in sighted individuals, whereas their "vision" areas reorganize. Unlike deaf infants who receive cochlear implants or sign language input, individuals who are blind do not generally receive (nor appear to require) additional input to support language. This is despite the fact that, in typical development, visual experiences have been linked to language learning (e.g., seeing where parents are looking or pointing).

This project seeks to provide critical evidence concerning how the brain rewires when infants are born unable to hear or see, and how early brain development enables language acquisition. The project also has two educational objectives that will increase science knowledge among the lay public and schoolchildren while building the communication, research, and pedagogical skills of tomorrow's scientists.

Across five studies, this project will examine the knowledge and processing of language in infants who are blind, deaf or hard of hearing, or typically developing. Because infants understand words months before producing them, this work focuses on early word comprehension, particularly on words' sounds and meanings, using behavioral, neural, and observational approaches. Real-time language processing will be measured by tracking eye movements and brain waves in response to language stimuli. In other studies, word learning will be tracked through analysis of naturalistic recordings of parent-child interactions. Together, the results of these approaches will provide critical insights into how children learn and grow with the types of language input they receive, as well as how to improve language learning in special populations. The educational component of this project includes a multi-tiered research communication plan, along with a set of after-school enrichment activities for local K-2nd graders. These activities will focus on teaching students about the sensory organs in an engaging, experiential format, with simple demonstrations to foster STEM excitement and learning.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 1844710
Program Officer: Tyler Kendall
Budget Start: 2019-05-15
Budget End: 2024-04-30
Fiscal Year: 2018
Total Cost: $253,616
Name: Duke University
City: Durham
State: NC
Country: United States
Zip Code: 27705