In our information-driven, Web-based society, we are all gradually falling victim to information overload. While sighted people can quickly skim Web content to get the gist of the information and find what they need, blind users are stymied: they must rely on screen-reader software with limited functionality that narrates content using computer-generated speech through a serial audio interface, which gives them no way to judge what content is important before listening to it. They must therefore either listen to all of the content or listen to the first part of each sentence or paragraph before skipping to the next. The PI's goal in this project is to address this problem by developing novel interfaces and algorithmic techniques for non-visual skimming that will empower people with visual impairments to access information on the Web significantly faster than is currently possible with state-of-the-art screen readers.

When skimming, sighted people quickly look through content and pick out keywords and relevant phrases. The PI's approach is to emulate this process and enable a computer-assisted skimming experience for screen-reader users. To this end, he will iteratively and concurrently pursue two research directions: designing interfaces for non-visual skimming, and developing the algorithms that enable those interfaces. Through a process of participatory design, interfaces will be derived for skimming with standard shortcut-driven screen readers, touch-based devices, and simulated haptic surfaces, while summarization algorithms will be created to support skimming of various types of Web content at different levels of granularity and speed. Controlled and in-situ real-world experiments will be conducted to evaluate the utility of the resulting interfaces and algorithms. Project outcomes will include an open-source skimming tool and reusable datasets of Web pages with annotations and summaries.
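The abstract does not specify the summarization algorithms; as a hedged illustration only, the idea of reducing each sentence to its most informative words at a tunable granularity could be sketched with a simple frequency-based extractive pass (the function name, stop-word list, and `level` parameter are assumptions, not the project's actual method):

```python
import re
from collections import Counter

# Crude illustrative stand-in for a skimming summarizer: keep only each
# sentence's most frequent (hence, roughly, most topical) words.
# The granularity knob `level` is the fraction of words retained.
STOP = {"the", "a", "an", "of", "to", "and", "in", "is", "are",
        "for", "on", "that", "with", "by", "it", "this"}

def skim(text, level=0.3):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(w for w in words if w not in STOP)
    skimmed = []
    for s in sentences:
        toks = re.findall(r"[A-Za-z']+", s)
        k = max(1, int(len(toks) * level))  # words to keep per sentence
        keep = {w.lower() for w in
                sorted(toks, key=lambda w: -freq[w.lower()])[:k]}
        skimmed.append(" ".join(w for w in toks if w.lower() in keep))
    return " … ".join(skimmed)
```

A real system would need far more (discourse structure, named entities, user-adjustable speed), but even this sketch shows how one parameter can trade off brevity against coverage, mirroring the "different levels of granularity" goal.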

Broader Impacts: Technology to facilitate non-visual skimming will transform information access for visually impaired users while also contributing to the fields of natural language processing, machine learning, and Web information retrieval. This research will additionally help us better understand how a combination of touch and haptic interfaces can improve website navigation and skimming. Although not explored in this research, project outcomes will likely prove useful to people with other disabilities, such as cognitive and motor impairments, and may ultimately be helpful to sighted people as well.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1218570
Program Officer: Ephraim Glinert
Budget Start: 2012-10-01
Budget End: 2016-09-30
Fiscal Year: 2012
Total Cost: $500,000
Name: State University New York Stony Brook
City: Stony Brook
State: NY
Country: United States
Zip Code: 11794