This project examines how American English speakers learn a case-marking, flexible word-order language as a second language under different context conditions (with supporting images or translations). It links learners' real-time processing of the new language, as they read sentences, to their learning outcomes. Languages whose structures differ from those of English pose great difficulty for learners. This project examines the origins of this attested difficulty and how visual information (images) can guide learners' attention and help them notice and understand new grammatical structures. It considers the role of the learning environment in grammar learning and can illustrate how language input is better integrated with non-linguistic, multimodal contextual support. It can guide educators and policy makers in the development of educational software, game design, and online language learning, and can inform the design of successful language programs for U.S. adult learner populations.

The studies include a language-learning phase and a subsequent testing phase. During the learning phase, self-paced reading and eye-tracking will show where learners allocate their attention as they read second language sentences, and how attention to different parts of the sentence is modulated by the type of contextual support (images or translations). The goal is to examine how linguistic and visual information interact and compete for learners' attention. The hypothesis is that visual scenes can make some aspects of grammar 'stand out' as learners compare their sentence interpretation to the visual information. During the testing phase, multiple measures will assess participants' ability to comprehend and produce the new grammatical structure. These studies can show how multimodal input influences the focus of learners' attention during real-time sentence reading and how this processing affects their learning. This will advance our understanding of how the mind processes and integrates linguistic and visual information when learning a second language, which has clear educational applications, especially in online and multimedia language learning.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Project Start:
Project End:
Budget Start: 2020-07-15
Budget End: 2022-06-30
Support Year:
Fiscal Year: 2020
Total Cost: $16,409
Indirect Cost:
Name: University of Illinois Urbana-Champaign
Department:
Type:
DUNS #:
City: Champaign
State: IL
Country: United States
Zip Code: 61820