For people with low vision, as for the general population, the smartphone has become inextricably tied to daily life. It is also their go-to assistive device: they rely on the smartphone's built-in screen magnifier, Zoom on the iPhone and Magnifier on Android, to interact with it. But the usability of these screen magnifiers falls woefully short, adversely affecting productivity. First, the magnifier indiscriminately magnifies the raw screen pixels, whitespace included, as a blanket operation, pushing important contextual information, such as visual cues, out of the user's viewport. The user must then pan over the occluded portions and mentally reconstruct the context needed to interact with the content elements. Second, magnification gestures such as the bimanual multi-tap and multi-finger touch gestures tend to be more complex than the basic one-finger swipe, which makes them cumbersome and tiring to use. Remembering the entire repertoire of gestures is also difficult, and because these gestures all involve some combination of fingers, it is easy to mix up one with another. Third, virtual keyboards, which take up significant screen real estate as it is, are difficult to use for text entry and editing in the magnified view. Either the entire screen is magnified, keyboard included, so that some keys are occluded, or only the display area is magnified and the keys remain small; key presses are hard to perform in either case. Together, these usability issues create a vastly disproportionate gap in user experience and productivity between people with and without low vision.

This proposal seeks to develop the next-generation screen magnifier that will bridge this wide gap in user experience. It is rooted in three novel ideas. First, instead of indiscriminately magnifying the screen content as is done now, it will perform object-aware magnification: it will identify the objects in the graphical interface and compact the space between them so that contextually related objects stay close together in the magnified view. Second, by leveraging untapped built-in sensors such as the accelerometer, geomagnetic field, and barometric pressure sensors, it will expand the default surface gestures with surfaceless, natural gestures for magnification operations that can be performed with one hand, freeing the other hand for other tasks. More importantly, these gestures will be easy to perform, easy to learn, and easy to recall. Third, it will incorporate a novel keyboardless, gesture-based text entry and editing technique to eliminate the difficulties that virtual keyboards pose for text entry in magnification mode. These three ideas will inform the development of CxZoom, a transformative, next-generation smartphone screen magnifier for low vision. CxZoom will make interaction with smartphones far more usable for people with low vision, removing barriers to their productivity and empowering them to harness the power and connectivity of these devices to participate fully in the digital economy.
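
The object-aware magnification idea can be illustrated with a small sketch. The Python fragment below is a hypothetical, simplified illustration rather than the proposed implementation: it assumes element bounding boxes are already available (for example, from the platform accessibility tree), it compacts only the vertical whitespace between elements, and it applies the zoom factor afterwards so that related elements stay together in the magnified view. The element names, gap threshold, and zoom factor are illustrative.

from dataclasses import dataclass
from typing import List

@dataclass
class Element:
    label: str
    x: int   # left edge, in screen pixels
    y: int   # top edge, in screen pixels
    w: int   # width
    h: int   # height

def compact_and_magnify(elements: List[Element], zoom: float = 2.0,
                        max_gap: int = 8) -> List[Element]:
    """Collapse large vertical gaps between elements, then scale by the zoom factor.

    Plain pixel magnification scales the whitespace too, pushing related elements
    apart and out of the viewport; compacting the gaps first keeps context together.
    """
    ordered = sorted(elements, key=lambda e: e.y)
    compacted: List[Element] = []
    cursor = 0  # next available y position in the compacted layout
    for e in ordered:
        gap = e.y - cursor
        new_y = cursor + min(gap, max_gap) if compacted else e.y
        compacted.append(Element(e.label, e.x, new_y, e.w, e.h))
        cursor = new_y + e.h
    # Apply the zoom factor only after the whitespace has been compacted.
    return [Element(e.label, int(e.x * zoom), int(e.y * zoom),
                    int(e.w * zoom), int(e.h * zoom)) for e in compacted]

if __name__ == "__main__":
    screen = [
        Element("search box", x=16, y=40, w=288, h=48),
        Element("results list", x=16, y=240, w=288, h=320),  # large gap above
        Element("page footer", x=16, y=600, w=288, h=32),
    ]
    for e in compact_and_magnify(screen, zoom=2.0):
        print(e)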

Public Health Relevance

This project will research and develop next-generation smartphone screen magnification technology to eliminate the productivity-hampering usability barriers in current built-in screen magnifiers, which create a wide gap in user experience between people with and without low vision. Removing these barriers will bridge that gap and empower low-vision smartphone users to harness the power and connectivity of these devices and participate fully in the digital economy, just as people without disabilities do. It will thus advance the vision of Universal Accessibility, namely, that anyone should be able to reap the benefits of the information age with any computing device, unconstrained by any disability.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY030085-02
Application #
10116401
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Wiggs, Cheri
Project Start
2020-03-01
Project End
2023-02-28
Budget Start
2021-03-01
Budget End
2022-02-28
Support Year
2
Fiscal Year
2021
Total Cost
Indirect Cost
Name
State University New York Stony Brook
Department
Type
DUNS #
804878247
City
Stony Brook
State
NY
Country
United States
Zip Code
11794