This EAGER proposal develops a dense deformable-object tracking method for testing and ground-truthing highly dynamic maneuvers of robotic vehicles, prototyping a radically new approach to instrumentation for highly agile machines of many types. Using heterogeneous fusion of numerous sensing modalities, the PI proposes to create a high-resolution, high-bandwidth map of deformable and articulated bodies, such as cars with suspensions, humans, and legged robots, for performance evaluation and algorithm development for these highly dynamic research artifacts.

Broader Impacts: This work will facilitate a wide variety of research, including studies of agile robotics in dynamic environments. The impacts on robotic science, especially agile and fast-moving robots, are clear. The system will also benefit other fields, such as human motion analysis, ergonomics, and medical rehabilitation, where accurate, dense tracking and mapping of deformable scenes is required.

Project Report

Outcomes – Intellectual Merit: This NSF EAGER grant led to a successful prototype of a motion-capture facility for large-scale, dense 3D dynamic and deformable scenes, and contributed to the publication of four papers [1][2][3][4]. The prototype demonstrates that we can track, in real time, volumes on the order of 200 m^3 at sub-centimeter resolution, building highly accurate scene models that can serve as ground truth. We developed proof-of-concept environment tracking and modeling algorithms for a new type of real-time dense-motion-capture system. The success of this prototype has catalyzed a follow-on Major Research Instrumentation grant to develop a dense-motion-capture system capable of real-time, high-resolution, deformable-scene modeling covering 200 m^3 at sub-centimeter accuracy. This instrument is currently being developed under the NSF MRI award "Development of Large-Scale Dense Scene Capture and Tracking Instrument".

We are interested in fast, agile mobile robots operating in dynamic environments with deformable objects. Imagine fast-flying vehicles avoiding waving branches while navigating through dense moving vegetation, or a troupe of running robots traversing soft, pliable terrain such as mud, sand, or snow, all while estimating, modeling, and predicting complex ground interactions. If robots are to move quickly and confidently through such highly dynamic and deformable environments, then we need to devise better perception, planning, and control algorithms. Rigorously testing these algorithms in controlled experiments will require a new kind of motion-capture technology that can provide dense ground-truth measurements of a changing 3D scene. Tracking deformable environments necessitates a live, dense 3D map that is updated in real time. This EAGER grant has directly enabled the development of a system for rapid capture and construction of large, dynamic, high-resolution virtual environments that duplicate specific real-world environments, including deformable objects, with unprecedented density of detail.

When finished, this will enable a wide variety of new research. Experimentally validating novel perception, planning, and control algorithms for agile mobile robots, particularly those operating in dynamic environments with deformable objects, requires a ground-truth representation of those environments. Validating computational tools for tether dynamics and control of flexible multi-body systems requires capturing their configuration in a large environment. The study of human motion for biomechanics, physical therapy, and exercise science requires accurate capture of dynamically changing, deformable human shapes in a large environment. Image-guided surgical procedures require capture of localized, dense patient anatomical surfaces registered to surgical instruments in a larger surgical environment. In human visual perception and navigation, an accurate, dense model of the surrounding environment, including objects in motion, would dramatically advance eye-movement analysis by enabling fast, automated, and objective coding of the objects people see as they move through an environment. Finally, dense shape capture of foot deformations during walking and running on real sediments will shed light on the evolution of our uniquely human anatomy and gait, and on the biomechanics of barefoot walking and running.
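A rough sense of scale: 200 m^3 at 1 cm voxel resolution is 200 / (0.01)^3 = 2×10^8 voxels, each of which must be re-estimated as the scene deforms. As an illustration only, the sketch below shows a truncated signed distance function (TSDF) volume update, a standard building block for live dense 3D maps of the kind described above; it is not the project's code, and every name and parameter in it (VOXEL, DIMS, ORIGIN, integrate, the weight cap) is our assumption.

```python
import numpy as np

# Illustrative sketch only -- not the project's actual algorithms. A TSDF
# volume stores, per voxel, a truncated signed distance to the nearest
# surface plus an observation weight, and is refreshed with every frame.

VOXEL = 0.01                     # 1 cm voxels (sub-centimeter resolution)
DIMS = (128, 128, 128)           # small demo volume; full scale at 1 cm over
                                 # 200 m^3 needs ~585^3 ≈ 2e8 voxels
ORIGIN = np.array([-0.64, -0.64, 0.5])    # volume placement in the world (m)

tsdf = np.ones(DIMS, dtype=np.float32)    # signed distance, truncated to [-1, 1]
weight = np.zeros(DIMS, dtype=np.float32) # per-voxel observation confidence

def integrate(depth, K, T_cw, trunc=0.05):
    """Fuse one depth image (meters) into the volume.

    depth : (H, W) depth map from one sensor
    K     : (3, 3) pinhole intrinsics
    T_cw  : (4, 4) world-to-camera pose, e.g. from visual-inertial tracking
    """
    H, W = depth.shape
    # World coordinates of every voxel center.
    pts_w = np.indices(DIMS).reshape(3, -1).T * VOXEL + ORIGIN
    # Transform into the camera frame and project with the pinhole model.
    pts_c = pts_w @ T_cw[:3, :3].T + T_cw[:3, 3]
    z = pts_c[:, 2]
    zs = np.where(z > 0, z, 1.0)                  # guard the division
    u = np.round(K[0, 0] * pts_c[:, 0] / zs + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts_c[:, 1] / zs + K[1, 2]).astype(int)
    ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = depth[v[ok], u[ok]]                       # measured depth per voxel
    sdf = d - z[ok]                               # + in free space, - behind surface
    keep = (d > 0) & (sdf > -trunc)               # drop invalid depth, occluded voxels
    idx = np.flatnonzero(ok)[keep]
    new = np.clip(sdf[keep] / trunc, -1.0, 1.0)
    t, w = tsdf.reshape(-1), weight.reshape(-1)   # flat views into the volumes
    t[idx] = (w[idx] * t[idx] + new) / (w[idx] + 1.0)   # running weighted average
    w[idx] = np.minimum(w[idx] + 1.0, 64.0)       # cap confidence (see note below)

# Toy usage: fuse a flat wall 1 m in front of a camera at the world origin.
K = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
integrate(np.full((480, 640), 1.0, dtype=np.float32), K, np.eye(4))
```

The weight cap in the last line of integrate is the detail that matters for deformable scenes: bounding per-voxel confidence lets fresh observations overwrite stale geometry instead of being averaged away, which is what keeps the map "live" as the scene moves.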
Outcomes – Broader Impacts: This EAGER grant developed a prototype for a large-scale, dense motion-capture system for dynamic and deformable scenes. The full-scale instrument is now under development and will be housed in the GWU Motion Capture and Analysis Laboratory (MOCA). MOCA represents a collaboration of dedicated researchers and educators across the university and the greater Washington, DC area in diverse disciplines including anthropology, computer science, psychology, exercise sciences, orthopaedic surgery, dance, physical therapy, and mechanical and aerospace engineering. The core philosophy is to encourage interdisciplinary collaboration within a policy of open and equitable usage.

In addition to catalyzing and supporting multi-disciplinary research, MOCA aims to enhance the quality of high school, undergraduate, and graduate research and education. The GW Department of Computer Science offers a summer workshop in computer graphics for local high school students that uses MOCA. Courses at GW regularly use the equipment to teach motion capture and to provide undergraduate research opportunities in paleoanthropology, psychology, robotics, biomechanics, computer graphics, and mechanical engineering. At the graduate level, MOCA has become a critical component of dissertation training, making possible award-winning doctoral research in diverse disciplines. GW’s School of Engineering and Applied Science, and the participating PhD programs in particular, have a diverse student population with a percentage of minority and women students far above the national average.

[1] Steven Lovegrove, Alonso Patron-Perez, Gabe Sibley, "Spline Fusion: A continuous-time representation for visual-inertial fusion with application to rolling shutter cameras", British Machine Vision Conference, August 2013.
[2] Nima Keivan, Gabe Sibley, "Realtime Simulation-in-the-loop Control for Agile Ground Vehicles", Towards Autonomous Robotic Systems, August 2013.
[3] Gabe Sibley, Nima Keivan, Alonso Patron-Perez, Liz Murphy, "Scalable Perception and Planning Based Control", International Symposium on Robotics Research, December 2013.
[4] Liz Murphy, Gabe Sibley, "Incremental Unsupervised Topological Place Discovery", submitted to the IEEE International Conference on Robotics and Automation (ICRA), 2014.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1249409
Program Officer: Richard Voyles
Budget Start: 2012-09-15
Budget End: 2013-08-31
Fiscal Year: 2012
Total Cost: $72,834
Name: George Washington University
City: Washington
State: DC
Country: United States
Zip Code: 20052