There are currently very few ways for a blind person to navigate an unfamiliar indoor space without the assistance of a fully sighted person. The technology proposed by this project is designed to enable a visually impaired individual to find their way through large indoor environments such as airports, train stations, and shopping malls by recognizing semantic and salient visual features of the environment. No prior visit to or mapping of the environment is required, and there is no need to deploy or rely on any special infrastructure such as WiFi access points or infrared beacons. Researchers plan to use publicly available architectural layouts and information about the locations of shops, tracks, gates, and other visual cues. The platform is a cell phone mounted on a necklace that provides turn-by-turn directions through an audio voice-command interface. The technology processes video from the cell phone camera in real time using text and logo detection, localization based on prior knowledge of the layout, and the integration of accelerometer data with visual odometry.
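As a concrete illustration of the text-detection step, the sketch below finds candidate text regions in a single camera frame. The award text does not name a specific detector, so OpenCV's MSER (a common scene-text region detector) is assumed here as a stand-in; the size thresholds and file names are purely illustrative.

```python
import cv2

def detect_text_regions(frame):
    """Find candidate text regions in a camera frame using MSER.

    Illustrative stand-in only: the project does not specify its
    detector, so MSER plus simple text-like shape filters is assumed.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray)
    boxes = []
    for pts in regions:
        x, y, w, h = cv2.boundingRect(pts)
        # Keep regions with roughly text-like heights and aspect ratios.
        if 8 < h < 100 and w / float(h) < 10:
            boxes.append((x, y, w, h))
    return boxes

if __name__ == "__main__":
    frame = cv2.imread("station_sign.jpg")  # hypothetical test image
    for (x, y, w, h) in detect_text_regions(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("detections.jpg", frame)
```

In a real pipeline these candidate boxes would be passed to an OCR or logo-matching stage; the filtering constants above would need tuning for a phone camera.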

The blind and visually impaired population in the United States is large and expected to grow. If successfully implemented, this technology could have broader applications in location-based services, such as aiding people with spatial learning difficulties or guiding users to a specific location. The project team has the expertise required to develop this technology relatively rapidly and at an economical cost.

Project Report

Independent mobility is a critical human function, and it is almost universally taken for granted by fully sighted people. It involves localization, navigation, and obstacle avoidance. While obstacle avoidance can be partially addressed with canes or guide dogs, and outdoor navigation is facilitated by GPS, indoor navigation remains an intractable challenge. The I-Corps team proposes a new product, BlindNav, which enables a visually impaired person to find her way in large indoor environments such as airports, train stations, and malls by recognizing semantic and salient visual features of the environment. No prior visit to the train station or mapping of the station with SLAM techniques is needed; instead, BlindNav uses publicly available architectural layouts and information about the locations of shops, tracks, gates, and other visual cues. The platform is a cell phone mounted on a necklace that provides turn-by-turn directions through a voice interface. BlindNav processes video from the cell phone camera in real time using three main technologies: text and logo detection; localization based on prior knowledge of the layout, combining cues such as vertical lines and vanishing points with accelerometer data; and short-range visual odometry to mitigate hard visibility conditions.

There is currently no way for a blind person to navigate a new space without the assistance of a fully sighted person. The team aims to revolutionize the world for blind people by providing unprecedented independence. The integration of "accessibility" software, such as audio-feedback touch screens, has put mobile technology such as smartphones and tablets in the hands of the visually impaired. The team has partnered with several members of the blind community to develop a simple, intuitive audio interface and to build grassroots adoption.

BlindNav is scheduled for nationwide commercial launch in October 2013. Currently in pre-alpha development, the team is leveraging its computer vision experience to port cutting-edge, proven research to the cell phone. Because the key to BlindNav's success is the team's integration filter, which combines multiple open-source computer vision tools for localization, all IP is owned by the company. To date, the team has implemented text detection and vertical line detection on the cell phone. Apple's universal access software, which makes mobile software usable by people with disabilities, started a revolution that other companies have followed, changing the entire landscape of assistive technology. The team's product would be the only one of its kind, offering a unique and elegant solution at an affordable price.
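Since the report states that vertical line detection has already been implemented on the phone, the following is a minimal sketch of how such a cue might be extracted: a Canny edge map followed by a probabilistic Hough transform, keeping only near-vertical segments (e.g., door frames, pillars, sign posts). All thresholds here are illustrative assumptions, not project values.

```python
import cv2
import numpy as np

def detect_vertical_lines(frame, max_tilt_deg=5.0):
    """Detect near-vertical line segments in a camera frame.

    A minimal sketch of the vertical-line cue mentioned in the report;
    the edge and Hough parameters are illustrative assumptions.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=60, maxLineGap=5)
    vertical = []
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
            # Keep segments within a few degrees of vertical (90 deg).
            if abs(angle - 90.0) < max_tilt_deg:
                vertical.append((x1, y1, x2, y2))
    return vertical
```

Segments like these, together with vanishing-point estimates and accelerometer data, could feed the localization step described above; how the project's integration filter actually fuses them is not specified in the award text.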

Agency: National Science Foundation (NSF)
Institute: Division of Industrial Innovation and Partnerships (IIP)
Type: Standard Grant (Standard)
Application #: 1265129
Program Officer: Rathindra DasGupta
Project Start:
Project End:
Budget Start: 2012-10-01
Budget End: 2014-03-31
Support Year:
Fiscal Year: 2012
Total Cost: $50,000
Indirect Cost:
Name: University of Pennsylvania
Department:
Type:
DUNS #:
City: Philadelphia
State: PA
Country: United States
Zip Code: 19104