A hybrid-reality environment, named PI2, is expected to become an integral part of a long-term vision for complex data analysis at this institution: a human-computer symbiosis in which humans guide computers to identify features of potential interest that the computer then locates and displays. Realizing this vision requires advances in multiple areas, including semi-automatic feature detection, visual representations, and interaction, where traditional display modalities limit what can be displayed and perceived. The instrument also facilitates broad interdisciplinary research and provides an innovative teaching and research environment for a diverse student population. It contributes to training future generations of researchers in state-of-the-art interactive visual computing for data analysis, enables broader activities and courses, and extends research outreach to new applications (e.g., digital humanities for the American Indian population in collaboration with the Smithsonian Museums). Expectations include:
- Rapidly advancing multiple avenues of creative inquiry that are currently blocked or severely restricted (the instrument encourages visual thinking among researchers in the sciences, healthcare, biomedicine, national security, the humanities, and education);
- Establishing appropriate levels of technology needed for different classes of knowledge-discovery analysis; and
- Assembling a set of research projects to investigate the use of the instrument, with the expectation of creating a novel, demonstrably useful, rich, and expressive set of techniques for many cyber-physical and cyber-human systems.

PI2, a hybrid-reality environment, aims to support research in interactive computing and digital humanities. The omni-stereo and mono modalities of the instrument break the traditional barriers between virtual reality (VR) and tiled wall displays. The ability of PI2 to synthesize, capture, create, and analyze visual information in unprecedented detail can transform the way analysts interact with visual information. Leveraging several important characteristics (immersion, hybrid reality, high resolution, large field of view, large space and size, body-centric human-computer interaction, and support for heterogeneous data fusion), the instrument benefits projects in many research areas, including the brain connectome, woodland ecology, interpersonal experiences, biomedicine, universal access, engineering physics, simulations, systems biology, education, digital humanities, green technologies, and unmanned-vehicle studies. It brings together disparate fields: natural language processing, wearable computing, visualization, data mining, and interaction.

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant (Standard)
Application #: 1531491
Program Officer: Rita Rodriguez
Budget Start: 2015-09-01
Budget End: 2019-08-31
Fiscal Year: 2015
Total Cost: $377,994
Name: University of Maryland Baltimore County
City: Baltimore
State: MD
Country: United States
Zip Code: 21250