When people talk, they gesture. Gesturing helps speakers organize their thoughts and words, and it helps listeners more fully understand the information being conveyed. Instructional techniques have begun to incorporate speech-accompanying gestures to facilitate learning. However, it is unclear how and why gestures aid learning. The goal of this research is to investigate whether, and if so how, observing gestures differs from observing actions performed on objects in facilitating young children's word learning. The research employs both behavioral measures of learning and brain imaging techniques to address this issue. The findings will ultimately help to optimize learning environments.
This project addresses the hypothesis that gesture's facilitative effect on learning is distinct from the effect of observing actions because gestures are a more abstract form of representation: they highlight the important components of an action without being tied to a specific learning context. The project explores the ways in which gestures versus actions facilitate word learning in 4- to 5-year-old children. The research goals include (1) comparing the impact that learning a word through gesture, as opposed to action, has on the word's learning trajectory; (2) evaluating how well words are generalized and retained over time after they are learned through gesture or action; and (3) exploring the neural mechanisms underlying each of these types of learning experiences. The results will clarify whether and how gesture promotes learning that goes beyond the particular and extends over time.