Touchscreen interfaces are increasingly the primary means by which people interact with computers, yet for people with motor impairments, many touchscreen commands are difficult or impossible to execute. As touchscreen devices proliferate, it becomes critically important for hardware and software developers to ensure that such devices are accessible to a broad range of users. While these challenges can be partially offset by multimodal (e.g., speech) input, touch and gesture remain necessary for fully functional, efficient, and socially acceptable use of many touchscreen devices. This is a serious concern for the nearly 20 million people in the U.S. whose motor impairments affect their upper body, a number that will only rise with the unprecedented growth of America's senior population.

This project pursues a research program to advance a fundamental understanding of how decreased motor ability affects touchscreen interaction and, in turn, how touchscreen interactions can be personalized to support each user's abilities. While substantial user-interface development effort has focused on personalizing content, personalizing the interaction itself, such as altering how users issue the taps, swipes, and clicks that underlie touchscreen use, has received much less attention. The increased use of touchscreens presents a tremendous opportunity for software-based modification because the entire interactive surface is software-controlled, an advantage this researcher has already leveraged to adapt touchscreen keyboards to situational changes in the motor abilities of people without disabilities, such as while walking. This project goes far beyond that preliminary work to benefit people with permanent motor impairments.

The project consists of two complementary major activities. The first employs large-scale studies to reliably assess and predict the impact of motor ability on touchscreen interaction. By developing new methods to leverage user-generated content (e.g., videos, tweets) and by employing large-scale online experimentation, these studies will provide a more in-depth and ecologically valid characterization of how motor ability impacts touchscreen use than has previously been possible. The second major activity builds on findings from the first to design and evaluate new approaches for personalizing touchscreen interaction. It will contribute new techniques for personalizing mobile interactions, generate new algorithms and predictive models of touchscreen performance, and identify design guidelines for personalizing mobile interaction.

Broader Impacts: This work will transform mobile accessibility for people with motor impairments. Enabling mobile access can lead to greater empowerment and independence for people with disabilities, and many of the proposed techniques will likely benefit users more broadly. This work also has implications for the accessible design of the next generation of mobile devices, including wearables and 3D-gesture interfaces. Many of the techniques should be applicable in a commercial context, which is important to ensure that new commercial interfaces are accessible to all users. Education plans include two courses related to accessibility, one of which establishes a partnership with the DC Public Library on touchscreen training for people with disabilities.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 1818594
Program Officer: Ephraim Glinert
Budget Start: 2017-09-01
Budget End: 2022-01-31
Fiscal Year: 2018
Total Cost: $347,570
Name: University of Washington
City: Seattle
State: WA
Country: United States
Zip Code: 98195