Data overload, especially in the visual channel, and the associated breakdowns in monitoring already represent a major challenge in data-rich environments. One promising means of overcoming data overload is the introduction of multimodal displays, which distribute information across several sensory channels (including vision, audition, and touch). In recent years, touch has received increasing attention as a means of offloading the overburdened visual and auditory channels, but much remains to be discovered about this modality. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information in the absence of vision. However, the effectiveness of tactons may be compromised if their design does not take into account that complex systems depend on the coordinated activities of a team. The PI's goal in this project is to establish a research program that explores adaptive collaborative tactons as a means of supporting situational awareness, that is, the ability of a team to perceive and comprehend information from the environment and to predict future events in real time. Project outcomes will contribute to a deeper understanding of perception and attention within and across sensory channels, for both individuals and teams, and of how multimodal interfaces can support teamwork in data-rich domains.
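To make the notion of a tacton concrete, the following is a minimal sketch, assuming a simple vibrotactile actuator, of how a tacton might be encoded using the parameters commonly varied in the tacton literature (rhythm, frequency, amplitude, and body location). The Tacton class and the two example messages are illustrative and hypothetical, not part of the proposed work.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)
class Tacton:
    """One tactile message, encoded by the vibration parameters
    commonly varied in tacton design: rhythm, frequency,
    amplitude, and body location."""
    name: str
    rhythm: List[Tuple[float, float]]  # (on_ms, off_ms) pulse pairs
    frequency_hz: float                # carrier frequency of the vibration
    amplitude: float                   # drive level, 0.0-1.0
    location: str                      # actuator placement, e.g. "left_wrist"

    def duration_ms(self) -> float:
        """Total playback time of the pulse train."""
        return sum(on + off for on, off in self.rhythm)

# Two hypothetical alert messages, distinguished by rhythm, intensity,
# and body location so they remain discriminable without vision:
URGENT = Tacton("urgent", [(100, 50)] * 4, 250.0, 1.0, "left_wrist")
ROUTINE = Tacton("routine", [(400, 200)] * 2, 250.0, 0.6, "right_wrist")

print(URGENT.duration_ms())   # 600.0 ms of rapid pulsing
print(ROUTINE.duration_ms())  # 1200.0 ms of slow pulsing
```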
The work will integrate three disparate topics within human factors: multimodal interfaces, situational awareness, and adaptive systems. The PI will create methods for designing tactons that take into account both context and the types of information a team needs, leveraging the multimodal aspects of the task environment to develop quantitative and qualitative models and algorithms based on physiological measures (in particular, eye-tracking data). These models and algorithms will in turn inform the functionality of adaptive tactons that support collaboration by adjusting the presentation of information in response to sensed parameters and conditions.
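As an illustration of the adaptation logic described above, here is a minimal sketch assuming one plausible gaze-contingent rule: a tacton is fired when a monitored display region has gone without a fixation for longer than a threshold. The class name, callbacks, and threshold are all hypothetical; the abstract does not specify which models or triggering rules the project will develop.

```python
import time
from typing import Callable

class AdaptiveTactonTrigger:
    """Fires a tactile cue when a monitored display region has gone
    unattended (no gaze fixation) for longer than a threshold.
    A hypothetical stand-in for the eye-tracking-driven adaptation
    described in the abstract."""

    def __init__(self, region: str, neglect_threshold_s: float,
                 fire_tacton: Callable[[str], None]):
        self.region = region
        self.neglect_threshold_s = neglect_threshold_s
        self.fire_tacton = fire_tacton
        self.last_fixation_s = time.monotonic()

    def on_fixation(self, fixated_region: str) -> None:
        """Called by a (hypothetical) eye-tracker pipeline for each
        detected fixation; resets the neglect timer for our region."""
        if fixated_region == self.region:
            self.last_fixation_s = time.monotonic()

    def tick(self) -> None:
        """Called periodically; fires a cue if the region is neglected."""
        neglected_for = time.monotonic() - self.last_fixation_s
        if neglected_for > self.neglect_threshold_s:
            self.fire_tacton(self.region)
            # Reset so the cue is not re-fired on every tick.
            self.last_fixation_s = time.monotonic()

# Example wiring: redirect attention to a neglected alarm panel after 10 s.
trigger = AdaptiveTactonTrigger(
    "alarm_panel", 10.0,
    lambda region: print(f"vibrate: attend {region}"))
```

In a team setting, the same mechanism could route the cue to whichever member's wrist-worn actuator corresponds to the neglected task, which is one way the "collaborative" aspect of adaptive tactons might be realized.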