The world is increasingly dominated by multimedia technology for communication, commerce, entertainment, art, education, and medicine. Since modern electronic media are rich in graphical and pictorial information, it has been hard for the population of visually impaired people to keep up. While some of this information can also be presented as speech or Braille text, the ability to directly present graphical and pictorial information in tactile form, in combination with auditory signals, would dramatically increase the amount of information that can be made available to the members of this large community, and would also drive advances in interfaces for diverse applications such as virtual reality and medicine.

Although the tactile sense has to date received relatively little attention due to the lack of versatile devices, recent advances in tactile display technology provide impetus for new research. While existing tactile devices are mostly static, there is great promise for dynamic devices based on emerging technologies such as electro-active polymers, pneumatics, and MEMS. Dynamic devices would make it possible to generate and display arbitrary tactile patterns, but to estimate the capabilities of different device configurations it is important to understand and model the device characteristics and how they relate to human perception. It has been shown that the relevant characteristics (material, surface shape) can be simulated using accurate static physical models. This sets the stage for potentially transformative research to enable the presentation of graphical and pictorial information in tactile-acoustic form, by exploiting the capabilities of tactile display devices in combination with the abilities of human tactile and auditory perception. Reaching that goal will require the investigation of fundamental issues in tactile perception as it relates to existing devices or the design of new ones, as well as the study of fundamental relationships among visual, tactile, and auditory perception.

The PI's objective in this exploratory project is to conduct preliminary work along these lines in order to establish the feasibility of the approach. To this end, he will develop mathematical models for tactile devices and perception, and conduct experiments to validate them. Research subtasks will include the development of algorithms for synthesizing tactile textures, the development of structural similarity metrics for visual, tactile, and acoustic textures, and the quantitative description of the perceptual dimensions of visual, tactile, and acoustic textures. Tests with sighted (visually blocked) and visually impaired people will measure participants' ability to discriminate among tactile patterns with and without acoustic feedback, identify dimensions of tactile texture perception (e.g., roughness, directionality), establish whether pattern labels can be learned with and without acoustic cues, and explore the brain's ability to integrate tactile information into a scene.
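The structural similarity metrics mentioned above compare textures through their statistics (means, variances, cross-correlations) rather than point-by-point differences. The sketch below is a generic SSIM-style comparison in Python, assuming only NumPy; it illustrates the general idea and is not the specific metrics developed in this project.

    import numpy as np

    def structural_similarity(x, y, c1=1e-4, c2=9e-4):
        """SSIM-style comparison of two equal-size texture patches based on
        their means, variances, and cross-covariance (global statistics only)."""
        x = np.asarray(x, dtype=np.float64).ravel()
        y = np.asarray(y, dtype=np.float64).ravel()
        mu_x, mu_y = x.mean(), y.mean()
        var_x, var_y = x.var(), y.var()
        cov_xy = ((x - mu_x) * (y - mu_y)).mean()
        luminance = (2 * mu_x * mu_y + c1) / (mu_x**2 + mu_y**2 + c1)
        structure = (2 * cov_xy + c2) / (var_x + var_y + c2)
        return luminance * structure  # close to 1 for structurally similar patches

    # A sinusoidal texture compared with a slightly shifted copy of itself
    # scores much higher than when compared with an unrelated random patch.
    rng = np.random.default_rng(0)
    base = np.sin(np.linspace(0, 8 * np.pi, 256)).reshape(16, 16)
    shifted = np.roll(base, 1, axis=1)
    noise = rng.normal(scale=base.std(), size=(16, 16))
    print(structural_similarity(base, shifted))  # high similarity
    print(structural_similarity(base, noise))    # low similarity

Metrics used in practice typically operate on local windows or filter-bank subbands rather than whole patches, but they build on the same kind of statistical comparison.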

Broader Impacts: This research will contribute to fundamental advances in sense substitution and the use of touch for human-computer interaction. It will address fundamental problems in visual, tactile, and acoustic texture analysis and perception, and the use of touch for communication of graphical and pictorial information. Project outcomes will contribute to a deeper understanding of the sense of touch and its relation to vision. In addition to ultimately enabling visually impaired people to access pictorial information, the research will have an impact on a number of other areas, including virtual reality, interfaces with tactile feedback, product design, and medical applications.

Project Report

The world is increasingly dominated by multimedia technology for communication, commerce, entertainment, art, education, and medicine. Since modern electronic media are rich in graphical and pictorial information, it has been hard for the population of visually impaired people to keep up. In this exploratory project we conducted preliminary research that establishes the feasibility of presenting graphical and pictorial information in tactile-acoustic form, by exploiting the capabilities of tactile display devices in combination with the abilities of human tactile and auditory perception. While some of this information can also be presented as speech or Braille text, the direct presentation of graphical and pictorial information in tactile-acoustic form will dramatically increase the amount of information that can be made available to the visually impaired segment of the population. Moreover, advances in the tactile modality and its relation to and interactions with vision and hearing will impact the development of intuitive and natural interfaces for a number of other applications, including virtual reality, multimodal interfaces, and medicine. During this project, we developed mathematical models for tactile devices and perception, as well as algorithms for synthesizing tactile textures. Subjective experiments with sighted (visually blocked) and visually impaired people measured the discriminability of tactile and acoustic patterns and explored the brain's ability to perceive simple shapes and simple spatial layout information by integrating tactile and acoustic signals presented on a touch screen.

Intellectual Merit: This research contributes to fundamental advances in sense substitution and the use of touch and sound for human-computer interaction. It addresses fundamental problems in visual, tactile, and acoustic texture analysis and perception, and the use of touch for communication of graphical and pictorial information. It contributes to a deeper understanding of the sense of touch and its relation to vision.

Broader Impact: In addition to enabling visually impaired people to access pictorial information, this project will have an impact on a number of other areas, including virtual reality, interfaces with tactile feedback, product design, and medical applications. This project has given both graduate and undergraduate students the opportunity to perform research in this area and has strengthened their maturity as well as their research, presentation, and communication skills. Finally, research findings have been incorporated into the "Human Perception and Electronic Media" course that the PI teaches at Northwestern University.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1049001
Program Officer: Ephraim Glinert
Budget Start: 2010-08-15
Budget End: 2012-07-31
Fiscal Year: 2010
Total Cost: $60,000
Name: Northwestern University at Chicago
City: Chicago
State: IL
Country: United States
Zip Code: 60611