Although there appear to be many clear benefits to using multimodal human-machine interfaces in terms of increased information transfer, there are also potential performance costs that may result from inter-sensory biases, degraded processing associated with attending to more than one sensory modality, and inhibitory interactions in crossmodal reorienting. The creation of effective and intuitive multimodal interfaces hinges on the designer's ability to recreate the natural integration between sensory modalities that occurs in the real world (e.g., when we look toward a voice in a crowd). Unfortunately for the interface designer, the conditions that distinguish "good" from "bad" inter-sensory interactions have not been clearly elucidated, and many basic research questions in the realm of sensory integration must be answered before multimodal human-machine interfaces can be optimized.

In this project the PIs will conduct a series of basic psychophysical experiments to investigate several crucial outstanding issues in the development of multimodal human-computer interfaces. First, the spatial and temporal parameters needed to achieve faster response times will be identified through a systematic parametric investigation of crossmodal interactions using visual, auditory, and tactile signals. Second, because many human-machine interfaces must present signals from distinct areas of the display, the PIs will explore the crossmodal mapping between proximal and distal surfaces by extending their pilot research, which has demonstrated that tactile cues presented to a user's back can speed responses to visual targets presented on a monitor in front of the user. Knowledge gained from these experiments will be applied to the development of a novel haptic directional display for automobile driver assistance, whose effectiveness will be tested in a series of driving simulator experiments.

Broader Impacts: This project will significantly advance our understanding of human multi-sensory integration, and will provide a framework that can be used to guide both future research and the engineering of multimodal human-machine interfaces. The haptic driver assistance display to be developed will help transform automobile information technology, reducing the serious problem of driver distraction.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 0533908
Program Officer: Ephraim P. Glinert
Budget Start: 2005-11-01
Budget End: 2009-10-31
Fiscal Year: 2005
Total Cost: $481,052
Name: Arizona State University
City: Tempe
State: AZ
Country: United States
Zip Code: 85281