A novel method is developed to convert long video sequences from a single camera into 3D panoramic visualizations. The approach holds the promise of working in near real-time on a laptop. The 3D content allows users to change viewing angles and also supports panoramic views. The system has two real-time software components: (1) an image matcher that generates multi-view panoramic mosaics from 2D image sequences; and (2) a 3D renderer that provides an end user with a virtual fly-through or walk-through of a scene on commercially available 3D displays. The approach promises real-time 3D visualization for a broad range of camera types (high-end/low-end color, IR, and even gamma-ray/X-ray imaging), camera motions (linear, circular), motion platforms (airborne, ground vehicle, hand-held), and scene distances (from a few inches to thousands of feet).
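As a rough illustration of the first component, the sketch below shows the basic idea behind strip-based stereo mosaicing from a single translating camera: two panoramic mosaics with horizontal parallax are assembled by concatenating narrow vertical strips taken from two fixed columns of each frame. This is a simplified sketch only, not the PRISM algorithm itself (which also interpolates rays between frames to remove seams); the file name, column positions, and strip width are illustrative assumptions.

import cv2
import numpy as np

def stereo_mosaics_from_video(video_path, left_col=200, right_col=440, strip_width=4):
    # Build two panoramic mosaics with horizontal parallax from a video
    # captured by a single camera translating roughly parallel to the scene.
    # Strips cut from a column ahead of the image center view the scene from
    # one direction, strips from a column behind it from another; stacking
    # the strips frame by frame yields a left-eye and a right-eye mosaic.
    cap = cv2.VideoCapture(video_path)
    left_strips, right_strips = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        left_strips.append(frame[:, left_col:left_col + strip_width])
        right_strips.append(frame[:, right_col:right_col + strip_width])
    cap.release()
    # Concatenate the strips side by side to form the two panoramic mosaics.
    return np.hstack(left_strips), np.hstack(right_strips)

if __name__ == "__main__":
    # "flyover.mp4" is a hypothetical input video from a linearly moving camera.
    left, right = stereo_mosaics_from_video("flyover.mp4")
    cv2.imwrite("mosaic_left.png", left)
    cv2.imwrite("mosaic_right.png", right)

Viewing the two resulting mosaics as a stereo pair gives the depth impression that the 3D renderer then exploits for fly-through and walk-through visualization.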

This technology has matured through research in environmental monitoring, security, and defense applications, where video was captured from a single camera on board an aircraft or ground vehicle. The system produces 3D visualizations of the videos for further video data analysis, scene understanding, and scientific discovery. The technology has applications not only in defense and environmental monitoring but also in the entertainment, medical imaging, inspection, real estate, and land survey/development industries. Many 3D-ready devices, from TVs to smartphones, are already in the consumer market, but 3D content is still missing. To address this unmet need, the team is building a smartphone application that enables user-generated 3D content without requiring a 3D camera.

Project Report

Over the course of our research in the past decade, under the support of several NSF grants, we have developed a novel method to convert long video sequences from a single camera into 3D panoramic visualizations, using our unique approach called Parallel Ray Interpolation for Stereo Mosaicing (PRISM). With this I-Corps project, our goals were to learn what types of business models fit our technologies and capabilities, to make a go/no-go decision on technology transfer, and/or to adjust our technological and business plans.

The I-Corps project has had a great impact in equipping both the PI and the entrepreneurial lead (EL) with an entrepreneurship mindset for bringing NSF-sponsored research out of the lab. Getting out of the building to talk with customers led us to meet with over 90 customers in 6 weeks. This was made possible by the NSF I-Corps support, and the mentoring and feedback from the teaching team and our peers gave us a very rich learning experience. The EL is now better prepared to start a business and implement entrepreneurship plans, and the PI has gained significant hands-on experience in advising and training more students in entrepreneurship, through both his senior design course and extracurricular advising.

The impact of the project on technology transfer is at its core. The impact covers not only the proposed technology for a 3D viewing experience, the PRISM technology, but also other related technologies, most of them developed under NSF grant support. In fact, evaluating the readiness of the proposed technology for commercialization and making a go/no-go decision was the major goal of the program. For our interactive 3D viewing technology (a YouTube video of the PRISM technology is available), we will wait for the right time: we made a no-go decision on forming a company, but we remain open to working with partners and potentially licensing the technology. However, we have discovered that the interactive 3D experience with alternative perception for the blind, the VISTA technology (sponsored by an NSF EFRI grant), is in greater demand by users, and we have made a GO decision for this technology (a video of the VISTA technology and a user test is also available). The EL has started preparing a small business with other technological and business partners for related alternative 3D "display" technologies for visually impaired people, with the PI as technical advisor.

This I-Corps effort has also begun to have a profound impact on how the PI and students perform scientific research in the project's principal disciplinary fields, which include machine intelligence, computer vision, and human-computer interaction. The impact spans problem definition, algorithm design principles, usability studies, and customer analysis of a research project. We found that the customer discovery process was not only great for finding unaddressed customer needs but also useful for identifying under-researched problems that have real-world applications. We now have a new set of perspectives for identifying research problems and performing scientific research to find solutions.

Agency
National Science Foundation (NSF)
Institute
Division of Industrial Innovation and Partnerships (IIP)
Type
Standard Grant (Standard)
Application #
1243737
Program Officer
Rathindra DasGupta
Project Start
Project End
Budget Start
2012-07-01
Budget End
2013-08-31
Support Year
Fiscal Year
2012
Total Cost
$50,000
Indirect Cost
Name
CUNY City College
Department
Type
DUNS #
City
New York
State
NY
Country
United States
Zip Code
10031