To access computers, blind people have to use screen readers and/or Braille displays, which make the content of the computer screen accessible but require much time and patience because users have to search the screen sequentially for relevant content and controls. The goal of this Project is to develop Feel-IT, a portable hardware device that will enable blind people to interact with computers much faster through an audio-tactile (hearing and feeling) interface. Blind people will be able to use portable tactile Feel-IT gloves to “feel” the content of the computer screen while moving their hands on the surface of a regular desk. Combining the tactile and auditory sources of feedback will enable Feel-IT users to process information on the computer screen significantly faster. It will also reduce information overload, because users will no longer have to listen to all of the content on the screen. Tactile feedback will make computers more accessible to blind people, thus enabling them to participate fully in our digital society. The transformative idea of Feel-IT is that it will be a single device that makes all screens (including non-touch screens) of any size accessible with audio-tactile feedback. Feel-IT will be evaluated with blind computer users in real-life scenarios. Feel-IT will enable the study of the strategies that blind people employ for audio-tactile interaction with computers, improve our understanding of optimal audio-tactile interface design, and provide a design blueprint for future assistive technology devices. This Project will also provide accessibility research training to undergraduate and graduate students, especially those with vision impairments.

The goal of this project is to develop “Feel-IT,” a new human-computer interface that will improve the ability of blind and low-vision people to interact with computers. The interface will compensate for the users’ inability to see the screen content by enabling them to both hear the content and feel its tactile representation with both hands. The portable hardware device will connect to any computer (touchscreen or not) or smartphone via Bluetooth and allow users to interact with their devices by putting their hands in/on a pair of haptic “gloves” and moving them on any flat surface, e.g., a desk or a table. The Research Plan is organized under three objectives. OBJECTIVE 1 is to design and develop the Feel-IT gloves to support high-resolution tactile feedback and portability. Features of the glove include: TULAs (Tiny Ultrasonic Linear Actuators) that are placed on the distal phalanges of the fingers and act similarly to Braille cells but offer higher resolution while requiring lower voltage, less bulk, and fewer wires; infrared LEDs on each finger that enable a camera to track finger position and orientation; a stabilizer that allows more comfortable placement of the hand on the glove and greater control over movements; and buttons to click links or zoom in and out of any segment. OBJECTIVE 2 is to design audio-tactile interfaces that enable general exploration of the computer screen and support standard office apps. The design will build on a current Interface Manager and Finger Tracker that support one glove and enable tactile interaction only with web pages. User Interface (UI) Automation and/or standard vision segmentation algorithms will be used to obtain information about the computer screen content and convert it to a tactile representation. The interface manager, which will be limited to MS Windows, will represent, in tactile form, open windows, app icons, menu elements, and the content of app windows.
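The abstract does not specify how screen content is mapped onto the glove's actuators, but the conversion it describes could be sketched as rasterizing element bounding boxes (as obtained from UI Automation or vision segmentation) onto a coarse pin grid. The grid size, screen size, and function name below are illustrative assumptions, not part of the project description:

```python
# Hypothetical sketch: map screen-element bounding boxes onto a coarse
# "pin grid" whose resolution matches the glove's tactile actuators.
# Elements are (left, top, right, bottom) rectangles in screen pixels;
# in the real system they would come from UI Automation or a vision
# segmentation pass.

SCREEN_W, SCREEN_H = 1920, 1080
GRID_COLS, GRID_ROWS = 32, 18  # assumed actuator resolution

def to_tactile_grid(elements):
    """Return a GRID_ROWS x GRID_COLS grid of 0/1 pin states."""
    grid = [[0] * GRID_COLS for _ in range(GRID_ROWS)]
    for left, top, right, bottom in elements:
        # Project the pixel rectangle onto grid cells (inclusive range).
        c0 = int(left * GRID_COLS / SCREEN_W)
        c1 = min(GRID_COLS - 1, int((right - 1) * GRID_COLS / SCREEN_W))
        r0 = int(top * GRID_ROWS / SCREEN_H)
        r1 = min(GRID_ROWS - 1, int((bottom - 1) * GRID_ROWS / SCREEN_H))
        for r in range(r0, r1 + 1):
            for c in range(c0, c1 + 1):
                grid[r][c] = 1  # raise every pin under this element
    return grid

# Example: one window occupying the top-left quarter of the screen
# raises the pins in the corresponding corner of the grid.
grid = to_tactile_grid([(0, 0, 960, 540)])
```

Zooming into a window, as described for Objective 2, would then amount to re-running the same projection with the window's rectangle treated as the full screen.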
The system will enable the user to “zoom” into any window, feel its components, listen to the content, and then zoom out. OBJECTIVE 3 is to conduct human-subjects research in laboratory and in-situ settings. Eighty blind and low-vision subjects will be recruited for user studies on a typical screen with multiple icons, windows, menus, etc., and with typical office applications, e.g., word processors, spreadsheets, etc. Performance measures include time to perform a task, effort to perform a task, and failures. Satisfaction measures will be determined by an After-Scenario Questionnaire, the 10-item System Usability Scale, and a System Preference Questionnaire. Expected research outcomes include: (1) design and software implementation of novel haptic interfaces for computer access; (2) understanding of the tactile behaviors and strategies employed by blind and low-vision people; and (3) insight into how dynamic haptic feedback can help improve computer interaction.
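As context for the satisfaction measures above, the System Usability Scale has a standard scoring rule: ten 1-5 Likert items, with odd (positively worded) items contributing rating - 1 and even (negatively worded) items contributing 5 - rating, the sum scaled by 2.5 onto 0-100. This is the conventional SUS formula, not a description of the study's exact instrument:

```python
def sus_score(responses):
    """Score the 10-item System Usability Scale (0-100).

    `responses` is a list of ten 1-5 Likert ratings, in item order.
    Odd-numbered items contribute (rating - 1); even-numbered items
    contribute (5 - rating); the total is scaled by 2.5.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A participant who answers "neutral" (3) on every item scores 50.
print(sus_score([3] * 10))  # 50.0
```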

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Budget Start: 2020-06-15
Budget End: 2022-05-31
Fiscal Year: 2019
Total Cost: $210,922
Name: State University New York Stony Brook
City: Stony Brook
State: NY
Country: United States
Zip Code: 11794