This PFI: AIR Technology Translation project focuses on translating gesture-based 3D modeling to fill a major technology gap between design and personal manufacturing. The translated gesture-based 3D modeling offers several unique features: hands-free, intuitive, on-the-fly interaction that transparently supports iterative early design thinking, and natural hands-based visual expression, in contrast to the leading windows-icons-menus-pointers (WIMP) based 3D modeling tools in this market space.

The project accomplishes this goal by developing natural-user-interface based gesture recognition and shape interaction technology, resulting in a proof-of-concept prototype of a free-form shape modeling 3D synthetic environment driven by a depth-sensing camera. The partnership engages ZeroUI to provide guidance on gesture-based 3D modeling, user testing with prototypes, and other critical aspects that define realistic future scenarios for translating gesture-based 3D modeling along a path toward a competitive commercial product.

The potential economic impact is disruption of a global $200 billion market for 3D technologies and applications within the next 3 to 5 years, which will strengthen U.S. competitiveness in this newly emerging gesture-based technology space and contribute to our nation's creative design capacity. The long-term societal impact will be to transform how everyone conceives and creates 3D shapes, thereby enabling a personal manufacturing industry.

Project Report

Very few digital tools transparently support the early design and creative thinking process and allow natural expression of shapes by augmenting the designer, instead of compartmentalizing the design process into procedural sub-processes. Our overarching mission, through and beyond Accelerating Innovation Research (AIR), is to enable personal manufacturing for everyone through design. To this end, we developed a framework for interacting with virtual shapes through natural hand motions in 3D space using depth cameras. We developed a prototype called z-Pots, tested it widely with users of all kinds, and validated it through demonstrations at major venues. More broadly, the culture of visual communication is strongly shaped by available technology, and our work will help change it through spatial interaction metaphors and algorithms. New design concepts, methods, and tools that support transformative design thinking are a key source of innovation, now recognized as the single most important ingredient for growing our economy. In the longer term, this continuing body of work developing and translating academic research into real-world tools will facilitate a paradigm shift in which designing naturally in 3D becomes part of our nation's creative design capacity.
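The report describes the hand-motion interaction framework only at a high level. As a purely hypothetical illustration of how a depth-camera pipeline might classify one simple hand gesture, the sketch below detects a "pinch" from 3D fingertip positions; the function names, coordinates, and threshold are assumptions for illustration, not the project's actual implementation:

```python
import math

# Assumed 3 cm pinch threshold; real systems would tune this per user/sensor.
PINCH_THRESHOLD_M = 0.03

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Classify a pinch: thumb and index fingertips nearly touching."""
    return distance(thumb_tip, index_tip) < threshold

# Example frames (coordinates in meters, e.g. from a depth camera's
# hand tracker): fingertips apart, then brought together.
open_hand = ((0.10, 0.00, 0.50), (0.16, 0.02, 0.50))
pinched = ((0.10, 0.00, 0.50), (0.11, 0.01, 0.50))

print(is_pinch(*open_hand))  # fingertips ~6 cm apart -> False
print(is_pinch(*pinched))    # fingertips ~1.4 cm apart -> True
```

In a full system, a gesture like this would trigger a shape-interaction action (grab, deform, release) on the virtual model each frame.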

Agency
National Science Foundation (NSF)
Institute
Division of Industrial Innovation and Partnerships (IIP)
Type
Standard Grant (Standard)
Application #
1312167
Program Officer
Barbara H. Kenny
Project Start
Project End
Budget Start
2013-05-15
Budget End
2014-10-31
Support Year
Fiscal Year
2013
Total Cost
$154,500
Indirect Cost
Name
Purdue University
Department
Type
DUNS #
City
West Lafayette
State
IN
Country
United States
Zip Code
47907