The Cyberlearning and Future Learning Technologies Program funds efforts that help envision the next generation of learning technologies and advance what we know about how people learn in technology-rich environments. Development and Implementation (DIP) projects build on proof-of-concept work that has shown the promise of a proposed new type of learning technology. Project teams build and refine a minimally viable example of their proposed innovation, which allows them to understand how such technology should be designed and used in the future and to answer questions about how people learn, how to foster or assess learning, and/or how to design for learning. This proposal uses advances in multimodal immersive interfaces, such as motion-detecting sensors like the Microsoft Kinect, to examine questions about how learners think with their bodies as they make sense of science concepts like 'scale' or 'rates of change'. The project will help create "simulation theatres for embodied learning": rooms with immersive technology that let students interact with science simulations and simultaneously express ideas by moving their bodies. Research studies will examine whether the gestures students use carry over from one science discipline to the next, and whether this type of interaction helps them transfer what they know in one science domain to others. At the end of the project, the work should yield (1) a technology platform for researching how students use gesture to understand science concepts, (2) evidence about how well this tool supports learning across disciplines, and (3) novel psychological research on how people think with their bodies.
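As a rough illustration of the kind of interaction such a simulation theatre might support, the sketch below maps the separation of a learner's hands, as a Kinect-style sensor might report it, onto a logarithmic 'scale' parameter of a simulation. The joint names, the numeric mapping, and the helper functions are hypothetical assumptions for illustration and are not taken from the project's design.

```python
import numpy as np

# Hypothetical skeletal frame from a Kinect-style depth sensor:
# joint name -> (x, y, z) position in metres. Joint names and the
# scale mapping below are illustrative assumptions, not the project's design.
def hand_span(frame):
    """Distance between the two hands, used as a body-based control input."""
    return float(np.linalg.norm(frame["hand_left"] - frame["hand_right"]))

def span_to_scale(span_m, min_span=0.2, max_span=1.8):
    """Map hand separation onto a logarithmic 'scale' axis (1e-9 m to 1e+9 m)."""
    t = np.clip((span_m - min_span) / (max_span - min_span), 0.0, 1.0)
    return 10.0 ** (-9 + 18 * t)  # hands together -> nanometres, arms wide -> gigametres

frame = {"hand_left": np.array([-0.45, 1.20, 2.00]),
         "hand_right": np.array([0.45, 1.20, 2.00])}
print(f"simulation scale: {span_to_scale(hand_span(frame)):.2e} m")
```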

This project seeks to extend and refine emerging theories of embodied learning and embodied design. Embodied interactions have shown promise for increasing learning of specific STEM concepts, but less is known about how body movement and gesture promote understanding of abstract and crosscutting ideas that may facilitate transfer of learning. This project explicitly examines whether persistent schemes of embodied interaction with computer simulations make it easier for learners to engage with, and learn from, new simulations of novel STEM topics. The project will also make intellectual advances in computational gesture recognition and processing, for instance through single-instance machine learning algorithms, real-time training, and modeling of parameterized full-body gestures. Together these advances will create a highly flexible gesture-learning environment that can be trained on individual subjects, achieving reliable recognition without building a large database of gestures. By developing (1) an easy-to-use, low-cost, and highly reconfigurable system for recognizing learning gestures and (2) an integrated set of learning simulations that rely on embodied interactions to investigate a broad range of STEM topics using consistent interface schemes, the project will be able to investigate how gestural congruency can be used to support learners' conceptions of STEM disciplines. Research studies in the initial phases will involve 12-15 middle-school students to help identify candidate gestures for cross-disciplinary gestural metaphors. Three later iterations, each with approximately 50 students, will examine whether interacting with the system can engage embodied metaphors that support transfer of learning from the domain of a STEM simulation to other domains, including the development of instruments for assessing transfer.
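One established way to recognize a gesture from a single training example, in the spirit of the single-instance algorithms mentioned above, is nearest-template matching under dynamic time warping (DTW). The minimal sketch below assumes each gesture arrives as a frame-by-frame trajectory of joint features; it is illustrative only and does not describe the project's actual recognizer.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two gesture trajectories.

    a, b: arrays of shape (frames, features), e.g. flattened joint positions
    recorded over the course of one gesture.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

class OneShotGestureRecognizer:
    """Keeps a single template per gesture label and classifies new trajectories
    by the nearest template under DTW, so no large gesture database is needed."""

    def __init__(self):
        self.templates = {}

    def train(self, label, trajectory):
        self.templates[label] = np.asarray(trajectory)   # one example per gesture

    def classify(self, trajectory):
        trajectory = np.asarray(trajectory)
        return min(self.templates,
                   key=lambda label: dtw_distance(trajectory, self.templates[label]))

# Toy usage: two "gestures" traced by a single hand in the x-y plane.
recognizer = OneShotGestureRecognizer()
recognizer.train("sweep_right", np.array([[0.0, 1.0], [0.3, 1.0], [0.6, 1.0], [0.9, 1.0]]))
recognizer.train("raise_up",    np.array([[0.0, 1.0], [0.0, 1.3], [0.0, 1.6], [0.0, 1.9]]))
print(recognizer.classify(np.array([[0.0, 1.0], [0.25, 1.05], [0.55, 1.0], [0.85, 0.95]])))
# -> "sweep_right"
```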

Budget Start: 2014-09-01
Budget End: 2019-08-31
Fiscal Year: 2014
Total Cost: $1,349,504
Name: University of Illinois Urbana-Champaign
City: Champaign
State: IL
Country: United States
Zip Code: 61820