This project investigates the use of head-mounted augmented reality (AR) to improve learning outcomes among deaf and hard-of-hearing learners in situations that are logistically challenging for them, specifically presentations in which some scenario must also be attended to visually. The work is being carried out in planetaria, where learners wear a monocle that displays a signer in a way that allows the learner to watch both the signed interpretation of the presentation and the scenario of interest at the same time. The design of the technology, and the way it is used, is informed by the literature on cognitive load and on multimedia learning theory (Mayer, 2005). Results are applicable to a wide variety of logistically challenging situations for deaf and hard-of-hearing learners, including the kinds of informal learning venues that often excite the passions of hearing learners, and perhaps classrooms as well.

Presentations, even when a signer is available, are often logistically difficult for the deaf and hard-of-hearing population to take full advantage of. Shifting attention back and forth between the interpreter and the objects or scenarios being described makes it hard to follow a presentation and get everything out of it that a hearing person can. This project aims to ameliorate this problem by designing technology that projects the interpreter's signs into the same field of vision as the object or scenario being discussed, and by learning how to use that technology well.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1124548
Program Officer: Janet L. Kolodner
Budget Start: 2011-09-01
Budget End: 2014-08-31
Fiscal Year: 2011
Total Cost: $300,000
Name: Brigham Young University
City: Provo
State: UT
Country: United States
Zip Code: 84602