We propose to develop and evaluate a cell-phone-based system that enables blind and visually impaired individuals to find and read street signs and other signs relevant to wayfinding. Using the built-in camera and computing power of a standard cell phone, the system will process images captured by the user to locate and analyze signs and speak their contents. This will provide valuable assistance to blind and visually impaired pedestrians in finding and reading street signs, as well as in locating and identifying addresses and store names, without requiring them to carry any special-purpose hardware. The sign finding and reading software will be made freely available for download to any camera-equipped cell phone running the widespread Symbian operating system (such as the popular Nokia cell phone series).

We will build on our prior and ongoing work applying computer vision techniques to practical problem-solving for blind persons, including cell-phone implementations of algorithms for indoor wayfinding and for reading digital appliance displays. We will develop, refine and transfer to the cell phone platform a new belief-propagation-based algorithm that has shown preliminary success in finding and analyzing signs under difficult real-world conditions, including partial shadow coverage. Human factors studies will help determine how to configure the system and its user controls for maximum effectiveness and ease of use, and will provide an evaluation of the overall system.

Sighted people take access to environmental labels, signs and landmarks for granted every day, but these tasks pose severe difficulties for the approximately 10 million Americans with significant vision impairments and the million who are legally blind. The proposed research would result in a highly accessible system (at zero or minimal cost to users) that augments existing wayfinding techniques and could dramatically improve independent travel for blind and visually impaired persons.
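The abstract names a belief-propagation-based algorithm for finding signs but does not describe its formulation. As a purely illustrative sketch of the general technique, the following min-sum loopy belief propagation over a 4-connected grid Markov random field labels each pixel as sign vs. background from per-pixel cue costs. The function name, the Potts smoothness prior, and the cue-based unary costs are assumptions for illustration, not the proposal's actual algorithm or features.

```python
import numpy as np

def loopy_bp_sign_map(unary, smoothness=2.0, n_iters=20):
    """Min-sum loopy belief propagation on a 4-connected grid MRF with a
    Potts smoothness prior and binary labels {0: background, 1: sign}.

    Illustrative sketch only; not the proposal's method.

    unary      -- (H, W, 2) per-pixel label costs (lower cost = more likely),
                  e.g. derived from a hypothetical local text/edge score
    smoothness -- penalty for neighboring pixels taking different labels
    returns    -- (H, W) array of estimated labels
    """
    H, W, L = unary.shape
    # msgs[d] holds the message each pixel has received from its neighbor in
    # direction d: 0 = from above, 1 = from below, 2 = from left, 3 = from right.
    msgs = np.zeros((4, H, W, L))
    opposite = {0: 1, 1: 0, 2: 3, 3: 2}
    # (row shift, col shift) that carries a message from sender to receiver.
    shifts = {0: (1, 0), 1: (-1, 0), 2: (0, 1), 3: (0, -1)}

    for _ in range(n_iters):
        new_msgs = np.zeros_like(msgs)
        for d in range(4):
            # Cost at the sender, excluding what it previously heard from the
            # neighbor it is about to send to.
            cost = unary + msgs.sum(axis=0) - msgs[opposite[d]]
            # Potts pairwise term: keep the receiver's label at its own cost,
            # or switch to the cheapest label and pay the smoothness penalty.
            msg = np.minimum(cost, cost.min(axis=-1, keepdims=True) + smoothness)
            msg -= msg.min(axis=-1, keepdims=True)  # normalize for stability
            dr, dc = shifts[d]
            # Deliver to the neighboring pixel (np.roll wraps at the image
            # borders; a real implementation would handle edges explicitly).
            new_msgs[d] = np.roll(msg, (dr, dc), axis=(0, 1))
        msgs = new_msgs

    belief = unary + msgs.sum(axis=0)
    return belief.argmin(axis=-1)
```

Under these assumptions, the unary costs might be built from a hypothetical per-pixel text-likelihood score in [0, 1] as np.stack([text_score, 1 - text_score], axis=-1), after which connected components of the returned sign mask would be candidate sign regions for subsequent text reading.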

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY018210-02
Application #
7911722
Study Section
Special Emphasis Panel (ZRG1-BDCN-F (12))
Program Officer
Wiggs, Cheri
Project Start
2009-09-01
Project End
2012-08-31
Budget Start
2010-09-01
Budget End
2012-08-31
Support Year
2
Fiscal Year
2010
Total Cost
$423,145
Name
Smith-Kettlewell Eye Research Institute
DUNS #
073121105
City
San Francisco
State
CA
Country
United States
Zip Code
94115
Publications
Manduchi, Roberto; Coughlan, James (2012) (Computer) Vision without Sight. Commun ACM 55:96-104
Sanketi, Pannag; Shen, Huiying; Coughlan, James M (2011) Localizing Blurry and Low-Resolution Text in Natural Images. Proc IEEE Workshop Appl Comput Vis 2011:503-510
Sanketi, Pannag R; Coughlan, James M (2010) Anti-Blur Feedback for Visually Impaired Users of Smartphone Cameras. ASSETS 2010:233-234