The PI's goal in this project is to establish advanced design strategies for the aural navigation of complex Web information architectures, in which users exclusively or primarily listen to, rather than look at, content and navigational prompts. Conventional on-screen visual displays may not work well, if at all, in many situations. The most obvious instances occur when people who are blind or visually impaired must use technologies designed for sighted users. A far more common situation, however, involves users of mobile devices, who are often engaged in another activity (e.g., walking around a city or driving a car) in which it is inconvenient, distracting, or even dangerous to look continuously at the screen. On the one hand, Web accessibility guidelines have focused on ensuring that websites are readable by assistive technologies (screen readers); on the other hand, recent aural browsers enable a more intelligent visual-to-aural transformation of specific features of Web pages. These advances, however, do not fully address the larger issue of aurally navigating complex information architectures. Previous work by the PI has shown that an effective aural experience requires new navigation patterns to overcome the limits imposed by the linearity of the aural medium. The PI will build on these preliminary results in the current project to provide an advanced level of usability for audio-based web interactions. He will explore conceptual design patterns for aural navigation by iteratively creating and refining aural design strategies inspired by the structural paradigms of human dialogue, applied to back and history navigation and to browsing large collections. A series of evaluation studies involving both visually impaired participants using screen readers and sighted participants using mobile devices will assess the potential and limits of these aural navigation paradigms to enhance the effectiveness of Web navigation.
Broader Impacts: This project will directly involve blind users in the design and evaluation of new aural design strategies, in collaboration with the Indiana School for the Blind and Visually Impaired (ISBVI). Project outcomes will yield a better understanding of the design issues and solutions for aural navigation, providing a solid long-term intellectual basis for the creation of better applications for visually-impaired users and for audio-only navigation contexts. In addition, the PI will work proactively to expand student participation in the research enterprise on his campus by establishing undergraduate user-experience labs.

Project Report

The objective of this project is to establish and evaluate design strategies for aural navigation in complex information architectures that can provide an advanced level of usability for audio-based web interactions. The rationale for our work is that a more complete understanding of the design issues and corresponding solutions involved in aural navigation provides a solid, long-term intellectual basis for the creation of substantially better applications for visually-impaired users and for audio-only usage contexts. The project generated four major results:

1. TOPIC- AND LIST-BASED BACK BROWSING SHORTCUTS FOR AURAL NAVIGATION FOR MOBILE AND SCREEN-READER USERS. Aural navigation is crucial in two contexts. First, it benefits mobile users navigating the web while unable to look at the screen, typically while engaged in a secondary task (such as walking, running, or driving); second, it enables blind and visually impaired users to browse a website more effectively by leveraging their auditory senses. Most importantly, the basic function of "back" navigation is very inefficient in the aural mode because it forces users to listen to part of the content of each previous page to retrieve the desired content. To address this problem, we introduced topic- and list-based "back", two strategies to enhance aural web browsing while on the go. In an experiment with mobile phone users, participants navigating with topic- and list-based "back" completed tasks 18% and 25% faster, respectively, and reported a better navigation experience, than those using the back button provided by the web browser. We hypothesized that the same strategies enhancing aural mobile navigation could also benefit visually-impaired users browsing the web with screen readers (text-to-speech software that "reads aloud" a web page).
In our study on topic- and list-based back conducted at the Indiana School for the Blind and Visually Impaired (ISBVI), blind users leveraging topic-based back reached previously visited pages 40% faster than those who relied on existing mechanisms. Participants navigating with list-based back completed tasks 79% faster than those who used the traditional back mechanisms.

2. AURAL FLOWS FOR MOBILE WEB BROWSING. Because large websites exhibit a complex hypertextual and hierarchical structure, simply "reading aloud" a mobile website weakens the user's orientation in the site's information architecture. To address this problem, we introduced techniques to linearize the information architecture of a website into aural flows: dynamic, user-controllable concatenations of web content to be listened to on the go, instead of looked at, inspired by the notion of a "playlist" in the music experience. To test this, we contributed and evaluated ANFORA (Aural Navigation Flows on Rich Architectures), a novel framework and system prototype that generates real-time aural flows (rendered through text-to-speech) from existing websites, thus optimizing them for the on-the-go user experience. For example, with ANFORA users can browse the latest NPR news on the go in semi-aural mode (primarily listening, while occasionally glancing at the screen) and conveniently consume web content as audio playlists. The results of this work informed several refereed papers, public showcases, and a provisional U.S. patent.

3. AURAL FAST BROWSING WITH GUIDED TOURS. Navigating back and forth from a list of links (an index) to its target pages is common on the web, but tethers screen-reader users to unnecessary cognitive and mechanical steps. This problem worsens when indexes lack information scent: cues that enable users to select a link with confidence during fact finding.
We investigated how blind users who navigate the web with screen readers can bypass a scentless index with guided tours: a much simpler browsing pattern that linearly concatenates the items of a collection. In a study at the Indiana School for the Blind and Visually Impaired (ISBVI), guided tours lowered users' cognitive effort and significantly decreased time-on-task and the number of pages visited, compared to an index with poor information scent.

4. AURAL GLANCING FOR DIRECT ACCESS TO KEY PAGE SECTIONS DURING SCREEN-READER NAVIGATION. Whereas glancing at a web page is crucial for navigation, screen readers force users to listen to content serially. This hampers efficient browsing of complex pages and maintains an accessibility divide between sighted and screen-reader users. To address this problem, we adopted a three-pronged strategy: (1) in a user study, we identified key page-level navigation problems that screen-reader users face while browsing a complex site; (2) through a crowd-sourcing system, we prioritized the most relevant sections of different page types necessary to support basic tasks; (3) we introduced DASX, a navigation approach that augments the ability of screen-reader users to "aurally glance" at a complex page by accessing the most relevant page sections at any time. In a preliminary evaluation, DASX markedly reduced the gap in page-navigation efficiency between screen-reader and sighted users.

The project results provided a solid research basis to empower people who need to navigate the web mainly through audio interfaces, and to enable people with visual impairments to participate more fully in the Information Society.

Agency
National Science Foundation (NSF)
Institute
Division of Information and Intelligent Systems (IIS)
Application #
1018054
Program Officer
Ephraim Glinert
Project Start
Project End
Budget Start
2010-08-01
Budget End
2014-07-31
Support Year
Fiscal Year
2010
Total Cost
$472,311
Indirect Cost
Name
Indiana University
Department
Type
DUNS #
City
Bloomington
State
IN
Country
United States
Zip Code
47401