This project uses methods from artificial intelligence (AI) to better understand how people learn visuospatial reasoning skills, like mental rotation, that are a critical ingredient in the development of strong math and science abilities. In particular, the project proposes a new approach to quantifying the learning value contained in different visual experiences, using wearable cameras combined with a new AI system that learns visuospatial reasoning skills from video examples. Results from this project will not only advance the state of the art in AI but also enable researchers to measure how valuable different real-world visual experiences are in helping people learn visuospatial reasoning skills. For example, certain types of object play may be particularly valuable for helping a child learn specific visuospatial reasoning skills. Ultimately, this new measurement approach could be used to identify early signs of visuospatial reasoning difficulties in children and could also inform the design of new visuospatial training interventions to boost children's early math and science development.

The core scientific question that this project aims to answer is: How are visuospatial reasoning skills learned from first-person visual experiences? This question will be addressed through computational experiments with a new AI system---the Mental Imagery Engine (MIME)---that learns visuospatial reasoning skills, like mental rotation, from video examples. Training data will include first-person, wearable-camera videos from two settings that are both important for human learning: unstructured object manipulation by infants and visuospatial training interventions designed for children. Results from experiments with MIME will advance the state of the art in both AI and the science of human learning by helping to explain how visuospatial reasoning skills can be learned from visual experiences and, in particular, how different kinds of visual experiences affect the quality of a person's learning outcomes.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Type: Standard Grant (Standard)
Application #: 1730044
Program Officer: Soo-Siang Lim
Project Start:
Project End:
Budget Start: 2017-08-15
Budget End: 2020-07-31
Support Year:
Fiscal Year: 2017
Total Cost: $200,000
Indirect Cost:
Name: Vanderbilt University Medical Center
Department:
Type:
DUNS #:
City: Nashville
State: TN
Country: United States
Zip Code: 37235