This project develops Enodia, a highly reconfigurable collaborative visualization instrument that supports virtual reality (VR) and 2D content and multiple interaction modalities (accommodating very different physical display constellations) at the institution and within satellite nodes at several collaborating colleges and universities. Enodia operationalizes the idea that different activities call for different physical arrangements of tools, workspaces, and people during intensive research engagement with screen-mediated content, especially for collaboration- and communication-oriented research activities. It builds upon groundbreaking hybrid tangible, multitouch, and gestural interaction support; allows highly diverse room-scale display geometries to be reconfigured within minutes or even seconds; flexibly drives many displays in diverse locations across several campuses (outreach partners); and permits its compute cluster to be back-filled at lower priority.
The approach allows the several large stage screens of the anchor visualization environment, along with multiscreen deployments at a flagship location and several regionally and nationally distributed satellite locations (all driven by a single computer cluster), to be physically reconfigured in minutes or even seconds. These configurations include walls, nooks, architectural mockups, and the partitioning of a room into multiple independent subareas, supporting diverse scientific domains and use contexts. The prioritized, layered virtual machine (VM) approach supports both local and distributed use, whether collaborative or independent.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.