The objective of this project is to test whether a framework proposed by the PIs can, in near real-time, read, write, and receive feedback from a model that fuses photographs from mobile devices with Building Information Models for the purpose of providing ubiquitous, marker-less contextual awareness for Architecture/Engineering/Construction and Facility Management (AEC/FM) applications. According to the framework, field personnel can use mobile devices to take pictures that include specific project elements (e.g., a column), touch or click on the elements in the image, and be presented with (or be able to add) a detailed list of information, such as architectural/structural plans related to the physical elements. The mobile device uses onboard GPS and other sensors to perform a rough calculation of the device's field-of-view and location. Initial image processing is done on the mobile device to extract and send feature points/descriptors, field-of-view, and location to the Hybrid 4-dimensional Augmented Reality (HD4AR) server. Using a new computer vision method, the server uses this information from the phone to derive the mobile device's position at a resolution that is an order of magnitude more accurate than current approaches based solely on GPS. The server then uses the derived high-precision camera position to determine what cyber-information is in view of the device's camera. The extracted information, along with pixel coordinates indicating where each cyber-information item should appear in the photo, is returned to the mobile device and visualized in an augmented reality format.
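
To make the client-server data flow described above concrete, the following is a minimal illustrative sketch in Python of the kinds of payloads that might be exchanged: the mobile client sends extracted feature descriptors plus a rough GPS/sensor estimate, and the server returns a refined camera pose together with cyber-information items and the pixel coordinates at which to render them. All class names, fields, and values here are hypothetical placeholders, not the PIs' actual HD4AR implementation or data format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LocalizationRequest:
    """Illustrative payload sent from the mobile device to the HD4AR server."""
    feature_descriptors: List[List[float]]      # feature points/descriptors extracted on-device
    rough_location: Tuple[float, float, float]  # coarse GPS estimate: lat, lon, altitude (m)
    field_of_view_deg: float                    # rough field-of-view from onboard sensors

@dataclass
class CyberInfoItem:
    """One piece of cyber-information anchored to a pixel location in the photo."""
    element_id: str             # e.g., a BIM element such as a column
    label: str                  # plan/spec text to display to field personnel
    pixel_xy: Tuple[int, int]   # where the item should appear in the image

@dataclass
class LocalizationResponse:
    """Illustrative server reply: refined camera pose plus in-view cyber-information."""
    refined_pose: Tuple[float, float, float, float, float, float]  # position + orientation
    items: List[CyberInfoItem]

def overlay(items: List[CyberInfoItem]) -> None:
    """Client-side step: render each returned item at its pixel coordinates (stubbed as print)."""
    for item in items:
        print(f"Render '{item.label}' for element {item.element_id} at {item.pixel_xy}")

if __name__ == "__main__":
    # Placeholder round trip; a real system would compute the pose with the server-side
    # computer vision method rather than hard-coding it.
    response = LocalizationResponse(
        refined_pose=(37.2296, -80.4139, 610.0, 0.0, 0.0, 0.0),
        items=[CyberInfoItem("COL-12", "Structural plan: column C-12", (412, 288))],
    )
    overlay(response.items)
```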

If successful, the results of this research will provide the first feasible platform for context-aware applications that does not require reliable, high-accuracy GPS/sensor-based location and orientation tracking and that works from existing image collections. It further assists field personnel through visualization of queried plan and actual site information in the form of augmented reality, and supports interactions among project personnel and field information. By providing immediate access to information, the proposed framework automatically provides inexpensive, global, and frequent reports on field activities, which in turn can reduce downtime, rework, waste, and ultimately cost overruns. This project also involves educational and outreach activities to promote teaching and learning, engage undergraduate and graduate students, and reach out to underrepresented groups, K-12 students, and industry professionals. These activities include the development of two course modules, "visual sensing for civil infrastructure engineering and management" and "mobile cyber-physical systems," as well as the creation of new software tools and hands-on outreach materials for context-aware AEC/FM applications, which will be widely distributed among research and professional communities.

Project Start:
Project End:
Budget Start: 2012-08-15
Budget End: 2013-10-31
Support Year:
Fiscal Year: 2012
Total Cost: $299,961
Indirect Cost:
City: Blacksburg
State: VA
Country: United States
Zip Code: 24061