Augmented Reality (AR) allows placing virtual objects in a real environment. AR glasses are the next wave of human-computer interaction technology, with a direct impact on education, manufacturing, national defense, and assembly jobs, among others. Understanding how people interact with an AR system using gestures, speech, or a combination of the two will provide foundational knowledge to decrease the complexity of user interaction. This research will have an impact on society by supporting underrepresented and non-traditional graduate students and by developing a new interdisciplinary course. The outcomes of this research will facilitate the creation of improved AR applications that benefit several fields, such as future workforce training. The datasets generated through this project can lead to collaborative explorations between computer scientists and cognitive/learning scientists to make AR learning applications more accessible and their interactions more intuitive.

This project will generate preliminary, foundational research on gestures for AR three-dimensional (3D) user interfaces, design egocentric gesture interactions (with and without speech) for AR headsets, construct two labeled datasets for gesture recognition, and develop a block-based, gesture-enabled application for AR headsets. The project will advance the state of the art in gesture interaction for AR headsets by: (1) Elicitation studies: the project will conduct the largest elicitation study to date, substantially increasing the foundational knowledge base and improving elicitation methodology and gesture recognizers. This study requires creating a novel application that will generate new collaborative interaction datasets, resulting in better foundational knowledge about user gestures for AR 3D user interfaces via gesture elicitation. (2) Dataset generation: the elicitation studies will produce labeled and unlabeled egocentric datasets that will enable human-computer interaction and computer vision researchers to explore new recognition algorithms. (3) Multi-modal (gesture and speech) interaction in AR headsets: the preliminary research conducted as part of this project will enable research on complex interactions and collaborative tasks using AR headsets, including egocentric gesture studies (with and without speech) to determine appropriate interactions for AR headset applications. The project's research activities will broaden the understanding and use of elicitation studies beyond action/gesture mapping, as well as of multi-modal (gesture and speech) interaction in AR headsets, either validating findings from human-to-human communication or discovering new ones. This research is applicable to domains that use AR headsets with intuitive user controls for interactive applications, such as industry, manufacturing, aviation, education, entertainment, energy, and defense.
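The abstract does not specify how the labeled egocentric datasets will be structured; purely as an illustrative sketch (every field name below is a hypothetical assumption, not the project's actual schema), one labeled gesture sample from an elicitation session might be represented as follows:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical sketch of a single labeled sample in an egocentric gesture
# dataset. The real schema produced by this project is not described in
# the abstract, so all field names here are assumptions for illustration.
@dataclass
class GestureSample:
    participant_id: str                      # anonymized elicitation-study participant
    referent: str                            # AR command the gesture was elicited for, e.g. "rotate object"
    gesture_label: str                       # annotated gesture class, e.g. "pinch-and-turn"
    frame_paths: List[str]                   # egocentric RGB frames captured by the headset camera
    hand_joints: Optional[List[list]] = None # per-frame 3D hand-joint positions, if tracked
    speech_transcript: Optional[str] = None  # accompanying utterance for multi-modal (gesture + speech) trials

# Example record (illustrative values only)
sample = GestureSample(
    participant_id="P07",
    referent="move block",
    gesture_label="grab-and-drag",
    frame_paths=["p07/trial03/frame_0001.png", "p07/trial03/frame_0002.png"],
    speech_transcript="move that block over here",
)
```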

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1948254
Program Officer: Balakrishnan Prabhakaran
Budget Start: 2020-08-01
Budget End: 2022-07-31
Fiscal Year: 2019
Total Cost: $191,000
Name: Colorado State University-Fort Collins
City: Fort Collins
State: CO
Country: United States
Zip Code: 80523