This project takes a holistic, human-centered approach to the design of intelligent driver support systems in which estimates of situational criticality are continuously monitored. This requires computational models that accurately estimate how a driver perceives situations, plans actions, and reacts to and interacts with the vehicle and its surroundings. The specific goal of the project is to develop computational frameworks for analyzing attention shifts from multimodal cues in an environment where time and safety constraints are critical. The specific research objectives are: (1) identification of body-related indicators of attention switching, using statistical machine-learning algorithms to analyze previously collected ethnographic datasets and determine the most useful indicators of attention shifts, including head and eye gaze, hand, foot, and other body motions; (2) understanding the effect of external visual and auditory saliency cues in the driving environment on attention shifts, by analyzing how these multimodal cues influence attention shifts in time- and safety-critical situations, incorporating both "top-down" goal-oriented and "bottom-up" distraction-based mechanisms; and (3) development of a hierarchical Bayesian model and computational framework that describes the relationship among body cues, external saliency, and the driving task in order to accurately estimate attention and attention shifts (a simplified sketch of such an estimator appears below). In summary, the project provides a feasibility assessment of detecting how and why attention shifts occur in the vehicular environment with a multimodal sensor suite. Project findings will inform the design of active safety systems that reduce crash risk on the roads.
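
To make the third objective concrete, the following is a minimal illustrative sketch, not the project's actual framework: a two-state Bayesian filter that fuses hypothetical multimodal cues (head yaw, gaze eccentricity, and an external-saliency score) to estimate the probability that the driver's attention is shifting away from the forward roadway. The cue names, state labels, and all parameter values are assumptions introduced here for illustration only.

```python
# Minimal sketch (illustrative assumptions throughout): a two-state Bayesian
# filter over the driver's attention state, updated from multimodal cues.
import numpy as np
from scipy.stats import norm

STATES = ["attention_on_road", "attention_shifting"]

# Assumed transition prior between consecutive time steps (rows sum to 1).
TRANSITION = np.array([[0.95, 0.05],
                       [0.30, 0.70]])

# Assumed per-state Gaussian likelihoods for each cue: (mean, std).
CUE_MODELS = {
    "head_yaw_deg":      [(0.0, 5.0),  (25.0, 10.0)],
    "gaze_eccentricity": [(0.1, 0.1),  (0.6, 0.2)],
    "saliency_score":    [(0.2, 0.15), (0.7, 0.2)],
}

def filter_step(belief, observation):
    """One predict/update step of the discrete Bayesian filter."""
    predicted = TRANSITION.T @ belief            # predict with transition prior
    likelihood = np.ones(len(STATES))
    for cue, value in observation.items():       # fuse independent cue likelihoods
        for s, (mu, sigma) in enumerate(CUE_MODELS[cue]):
            likelihood[s] *= norm.pdf(value, mu, sigma)
    posterior = predicted * likelihood
    return posterior / posterior.sum()           # normalize to a valid distribution

# Usage: start from a uniform belief and update as cue observations arrive.
belief = np.array([0.5, 0.5])
for obs in [{"head_yaw_deg": 2.0,  "gaze_eccentricity": 0.15, "saliency_score": 0.25},
            {"head_yaw_deg": 30.0, "gaze_eccentricity": 0.55, "saliency_score": 0.80}]:
    belief = filter_step(belief, obs)
    print(dict(zip(STATES, belief.round(3))))
```

The sketch treats cues as conditionally independent given the attention state; the hierarchical model described in the abstract would instead couple body cues, external saliency, and the driving task through shared latent structure.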

Project Start:
Project End:
Budget Start: 2009-10-01
Budget End: 2011-09-30
Support Year:
Fiscal Year: 2009
Total Cost: $80,000
Indirect Cost:
Name: University of California San Diego
Department:
Type:
DUNS #:
City: La Jolla
State: CA
Country: United States
Zip Code: 92093