The activity of visualizing and exploring complex multi-dimensional data provides insight that is essential for progress in several areas of science and engineering, where the amount and complexity of the data overwhelm traditional computing environments. This project will develop the first complete version of the Allosphere, an infrastructure that will provide powerful methods for detailed analysis, synthesis, and manipulation of such data by integrating multimodal representations of large-scale data with human-scale visualization and interaction techniques in a novel immersive environment.

Intellectual Merit: The Allosphere comprises a 10m diameter spherical display surface in a three-story near-anechoic building space, with a bridge running through the center that holds up to 25 participants. When fully equipped, the Allosphere will include high-resolution stereo video projectors to illuminate the complete spherical display surface, a large array of speakers distributed outside the surface to provide high quality spatial sound, a suite of sensors and interaction devices to enable rich user interaction with the data and simulations, and the computing infrastructure to enable the high-volume computations necessary to provide a rich visual, aural, and interactive experience for the users. The Allosphere will be one of the largest visualization/exploration instruments in the world, and it will serve as an ongoing research testbed for several important areas of computing. In addition, the Allosphere will serve as an environment for experimental media creation and performance, as a tool for scientific discovery in areas such as nanosystems, neuroscience, quantum computing, and biochemistry, and as an instrument for education and outreach.

Broader Impacts: This infrastructure project will complete the audio, interaction, and computational infrastructure of the Allosphere. This first version will serve as a computing research infrastructure in two primary ways: (1) as a platform for driving computing research needed to create future immersive multimodal, multimedia visualization environments, raising challenging problems in storage, networking, rendering, software virtualization, real-time simulation, and human-computer interaction; and (2) as an immersive visualization environment for research in computing areas such as scientific and information visualization, knowledge discovery, visual analytics, complex design, large-scale performance debugging, data mining, and cloud computing. As infrastructure for both computing and scientific exploration, the Allosphere will benefit a number of important fields that have wide impact on society. The facility will be made available to researchers and partners (in academia, industry, and government) beyond UCSB. The project will also leverage local outreach programs to attract K-12 students, underrepresented groups, and undergraduate students, and involve them in scientific projects and explorations that utilize this unique immersive environment. The facility will be integrated into courses and research projects in several departments at UCSB. Results will be disseminated via the project Web site (http://allosphere.ucsb.edu).

Project Report

The project helped to develop the AlloSphere, a large, multimodal immersive instrument that enables the development and application of powerful methods for visualizing, exploring, and evaluating complex multi-dimensional data, yielding insight in areas of science, engineering, and computing in which the size and complexity of the data overwhelm traditional computing and display environments. The first complete multimodal version of the AlloSphere was built, bootstrapping its use as a platform for research and an instrument for exploration and discovery. The facility has also served as an important vehicle for education and outreach at UC Santa Barbara. Fully immersive, multi-user, and interactive, the AlloSphere serves a multitude of research initiatives that integrate the life and physical sciences, design, and arts research, providing a platform for collaboration across disciplines and for engagement with the general public. The ability to see, hear, and interact with complex information allows interdisciplinary teams to collaborate across physics, chemistry, and biology, and to grasp the social, geographic, and economic variables of a complex system, giving a holistic view of critical problems facing the world, from global climate change to new materials synthesis to discoveries facilitating targeted medicine. Providing a language that allows humans to process this information through their senses is enabling broader public access to this knowledge.

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Application #: 0855279
Program Officer: Maria Zemankova
Budget Start: 2009-09-15
Budget End: 2014-08-31
Fiscal Year: 2008
Total Cost: $749,894
Name: University of California Santa Barbara
City: Santa Barbara
State: CA
Country: United States
Zip Code: 93106