This project will contribute to the development of new forms of mobile human-computer interaction that can be called "microinteractions." These are short interactions that, in some cases, are performed almost without conscious thought, comparable to glancing at a wristwatch to check the time. Such microinteractions already account for major uses of mobile computing and communication technologies: being reminded of an appointment, displaying the caller ID of an incoming call, noting a phone number or URL, making an appointment, or receiving a short e-mail. Yet few devices support microinteractions well, leading to underuse of this functionality, and few design guidelines exist for microinteraction interfaces. This research will increase the fundamental knowledge and design principles required to achieve substantial advances in this emerging field.

The research will investigate barriers to microinteractions, create design guidelines for microinteraction interfaces, create a MAGIC (Mobile Action and Gesture Interface Control) toolkit that enables gesture-based microinteraction techniques, and produce and study prototypes that instantiate the research results. The project will concentrate on the wristwatch as a socially acceptable, fast-to-access interaction platform. Previous work suggests that access time, the time needed to retrieve a device and navigate through its interface, is a major contributor to "balking," or deciding not to use an interface at all. The research will quantify the effects of access time on balking, create design guidelines on appropriate access times for typical microinteraction tasks, and offer concrete suggestions for minimizing balking by comparing access times for the wristwatch against other typical device mounting locations such as the pocket or belt. Specifically, four types of microinteractions will be examined: glances, computer-initiated events, user-initiated events, and a new style of interaction, "dual-purpose speech," in which key phrases in the user's conversation simultaneously trigger actions on the mobile device.
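As a rough illustration of the dual-purpose speech idea, the sketch below scans recognized utterances for key phrases and fires device actions when one appears. The phrase patterns, intent names, and callbacks are hypothetical stand-ins for illustration only, not the project's actual recognizer or vocabulary.

```python
import re

# Hypothetical key-phrase patterns: things a user might say naturally in
# conversation that also carry actionable content for the device.
PATTERNS = {
    "schedule": re.compile(r"\blet's meet (?:on )?(?P<day>monday|tuesday|wednesday|thursday|friday)\b", re.I),
    "contact":  re.compile(r"\bmy number is (?P<number>[\d\- ]{7,})", re.I),
}

def handle_utterance(utterance, actions):
    """Scan one recognized utterance for dual-purpose key phrases.

    `actions` maps an intent name to a callback; both the intents and the
    callbacks here are illustrative, not part of the project's system.
    """
    for intent, pattern in PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            actions[intent](match.groupdict())

# The same sentence the user speaks to a colleague also drafts a calendar
# entry, with no separate device interaction.
actions = {
    "schedule": lambda slots: print("Draft appointment for", slots["day"]),
    "contact":  lambda slots: print("Save contact number", slots["number"]),
}
handle_utterance("Sure, let's meet on Tuesday to go over the draft.", actions)
```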

The project will create wristwatch-based gesture, touchscreen, and speech system prototypes that support each of the microinteraction types. This work will encourage new directions in mobile interface research. The extensive involvement of students in the project will allow them to develop a wide range of hardware and software prototyping skills.

Project Report

Checking the time on a wristwatch is an example of a microinteraction: an interaction that provides the wearer with an important service with minimal disruption in both time and effort. Could other services, such as texting, weather, email, or calendar reminders, be made into microinteractions with the appropriate interface? This work examined such interfaces with a focus on how gestures can enable fast interaction. These gestures could be made on the surface of a touch-screen watch or with arm movements sensed by accelerometers or other sensors in a wristwatch. A major difficulty with using gestures to initiate a microinteraction is that they might trigger falsely throughout a user's day, especially while the user is on the go. To counter this problem, we built the MAGIC toolkit, which can predict which gestures will perform best in everyday use and can even suggest good gestures automatically (a rough sketch of this idea appears below). While designed for on-the-go interfaces, MAGIC can also be used with other gesture devices, such as the Kinect depth-sensing camera.

A second difficulty in creating microinteractions is providing alerts and feedback without distracting the user. We investigated haptic feedback on wristwatches to minimize demands on visual and manual attention. The principles, algorithms, and tools developed in this work may help in creating interfaces for future wearable computing devices, such as smart watches (for example, the Pebble) or head-up displays (for example, Google Glass).

In education, this project involved classes in Georgia Tech College of Computing's Device Thread, such as CS 3651 Prototyping Intelligent Applications and CS 4750 Mobile and Ubiquitous Computing, helping graduate and undergraduate students develop rapid prototyping skills for mobile computing.
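The report describes MAGIC as predicting which gestures will hold up in everyday use, which suggests scoring candidate gestures against recorded everyday motion. The sketch below shows one minimal way such a check could work for accelerometer gestures: it estimates a false-trigger rate by sliding each candidate template over a background recording and counting near matches. The windowed Euclidean distance, threshold, and synthetic data are assumptions for illustration, not the toolkit's actual algorithm.

```python
import numpy as np

def false_trigger_rate(template, everyday_motion, threshold, step=5):
    """Estimate how often a candidate gesture would fire by accident.

    template        : (T, 3) array of accelerometer samples for the gesture
    everyday_motion : (N, 3) array recorded during normal, non-gesture activity
    threshold       : distance below which a recognizer would report a match
    """
    T = len(template)
    hits = 0
    windows = 0
    for start in range(0, len(everyday_motion) - T, step):
        window = everyday_motion[start:start + T]
        # Mean per-sample Euclidean distance between the window and the template.
        dist = np.mean(np.linalg.norm(window - template, axis=1))
        hits += dist < threshold
        windows += 1
    return hits / max(windows, 1)

# Rank candidate gestures: a lower accidental-activation rate is better.
rng = np.random.default_rng(0)
everyday = rng.normal(scale=1.0, size=(20_000, 3))   # stand-in for a day of wrist motion
candidates = {name: rng.normal(scale=2.0, size=(60, 3))
              for name in ("flick", "circle", "double-tap")}
ranked = sorted(candidates, key=lambda n: false_trigger_rate(candidates[n], everyday, threshold=1.5))
print("Least likely to trigger falsely:", ranked[0])
```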

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 0812281
Program Officer: William Bainbridge
Project Start:
Project End:
Budget Start: 2008-09-01
Budget End: 2012-08-31
Support Year:
Fiscal Year: 2008
Total Cost: $495,999
Indirect Cost:
Name: Georgia Tech Research Corporation
Department:
Type:
DUNS #:
City: Atlanta
State: GA
Country: United States
Zip Code: 30332