This proposal focuses on extending and codifying methods and techniques for the evaluation of visual analytics. It primarily uses the Visual Analytics Challenge 2010 (organized through the contest of the IEEE Visual Analytics Science and Technology, or VAST, symposium) as a test bed for generalizing user-centered evaluation to assess the effectiveness of interactive systems that combine analytical reasoning, visual representations, human-computer interaction (HCI), complex algorithms, and collaboration tools. At present, there are no standard methods for measuring the performance of interactive visual analytics systems as a whole, and such systems are poorly served by traditional evaluation methods: simple measures such as precision and recall are insufficient on their own, and researchers have rarely had access to historical datasets that match the complexity of the analytical problems found in broader, real-time domains. The project encompasses three activities: 1) gathering a diverse collection of datasets with associated ground truth and problem descriptions, 2) developing automatic metrics and subjective evaluation criteria for assessing analytic judgment, and 3) investigating the effectiveness and utility of these methods, and the lessons learned, for reuse in other dynamic fields such as business, intelligence, medicine, and emergency response.
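
For concreteness, here is a minimal sketch (in Python; the function name and data are illustrative, not from the award) of the ground-truth precision/recall scoring that the abstract argues is insufficient for evaluating a visual analytics system as a whole:

    def precision_recall(reported, ground_truth):
        """Score a set of reported findings against a scenario's embedded ground truth."""
        true_positives = len(reported & ground_truth)
        precision = true_positives / len(reported) if reported else 0.0
        recall = true_positives / len(ground_truth) if ground_truth else 0.0
        return precision, recall

    # Example: an analysis session surfaces three entities, two of which
    # appear in the scenario's ground truth of four relevant entities.
    p, r = precision_recall({"A", "B", "C"}, {"A", "B", "D", "E"})
    print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.50

Scores like these capture only the correctness of the final answer, not the analytic process that produced it, which is one motivation for the project's pairing of automatic metrics with subjective evaluation criteria.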

Agency: National Science Foundation (NSF)
Institute: Division of Advanced CyberInfrastructure (ACI)
Type: Standard Grant (Standard)
Application #: 0947358
Program Officer: M. Mimi McClure
Project Start:
Project End:
Budget Start: 2009-09-15
Budget End: 2012-08-31
Support Year:
Fiscal Year: 2009
Total Cost: $150,000
Indirect Cost:
Name: University of Maryland College Park
Department:
Type:
DUNS #:
City: College Park
State: MD
Country: United States
Zip Code: 20742