Traditional robotics research has adopted a sense-plan-act paradigm, which assumes that the sensors can provide enough information to decide the next course of action. Humans and animals, however, frequently adopt a different approach, such as shuffling through a pile of unknown objects to identify an item of interest hidden beneath the pile. This project explores the concept of interactive perception (or manipulation-guided sensing), in which successive manipulations of objects in an environment are used to increase vision-based understanding of that environment, and vice versa. In particular, the project involves developing appropriate low-order models of highly non-rigid structures such as fabrics and textiles; constructing algorithms to perform real-time vision-based sensing of such objects in cluttered, unstructured environments; and building prototype robotic hardware for testing the resulting models and algorithms. The research is an integral part of developing next-generation household service robots that perform everyday tasks such as sorting and folding laundry.

Project Report

This project examined automated sensing and manipulation (and their interaction) of non-rigid objects. In particular, we investigated the problem of identifying and handling articles of clothing that are arbitrarily crumpled and piled, as when a dryer is unloaded during laundry. This part of the laundry process is particularly difficult to automate because of the unstructured initial arrangement of the clothes and the hard-to-predict movements of articles of clothing under manipulation, both consequences of their inherent lack of rigid elements. Our research demonstrated how "interactive perception", that is, successive operations of manipulating and then sensing non-rigid clothing items, improves the ability to classify individual items (as shirts, pants, shorts, etc.). Using a simple manipulator and inexpensive RGBD sensing (a Microsoft Kinect sensor), we further developed and implemented several algorithms for identification, classification, and tracking of flexible objects such as clothing. In support of this work, we built and archived a large indexed database of RGBD clothing images from real laundry baskets across a range of configurations. Further research concentrated on robotic manipulation of flexible surfaces and on 3D mapping of environments using RGBD sensors. We additionally conducted basic research demonstrating how sensing during active manipulation of articulated objects can be used to predict and reconstruct how they articulate (i.e., to find their hinge and rotation axes).
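The manipulate-then-sense cycle described above can be summarized as a simple loop: sense the pile, attempt to classify the visible item, and, if the classification is not yet confident, perturb the pile and sense again. The sketch below is illustrative only; capture_rgbd, classify_item, and perturb_pile are hypothetical placeholders standing in for the project's actual sensing, classification, and manipulation routines, and the belief update shown is deliberately trivial so the example stays self-contained.

```python
# Minimal sketch of an interactive-perception loop: sense, classify,
# and manipulate repeatedly until the classification is confident.
# All helpers here are hypothetical placeholders, not the project's code.
import random

CATEGORIES = ["shirt", "pants", "shorts", "socks", "towel"]

def capture_rgbd():
    """Placeholder for grabbing one RGB + depth frame (e.g., from a Kinect)."""
    return {"rgb": None, "depth": None}

def classify_item(frame, belief):
    """Placeholder classifier: update per-category scores from the new frame.

    A real system would extract shape and texture features from the RGBD
    frame; here the belief is nudged randomly to keep the sketch runnable.
    """
    scores = {c: belief.get(c, 1.0 / len(CATEGORIES)) + 0.1 * random.random()
              for c in CATEGORIES}
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

def perturb_pile():
    """Placeholder for a simple manipulation (poke, drag, or lift-and-drop)."""
    pass

def interactive_classify(max_actions=5, confidence=0.8):
    belief = {}
    for _ in range(max_actions):
        frame = capture_rgbd()
        belief = classify_item(frame, belief)
        label, p = max(belief.items(), key=lambda kv: kv[1])
        if p >= confidence:      # sensing alone now suffices; stop early
            return label, p
        perturb_pile()           # otherwise manipulate and sense again
    return label, p

if __name__ == "__main__":
    print(interactive_classify())
```

The key design point the sketch tries to convey is that manipulation is treated as a sensing action: each perturbation is chosen to reveal more of the item, so classification confidence can improve with every cycle rather than relying on a single static view.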

Project Start:
Project End:
Budget Start: 2010-08-15
Budget End: 2014-07-31
Support Year:
Fiscal Year: 2010
Total Cost: $417,908
Indirect Cost:
Name: Clemson University
Department:
Type:
DUNS #:
City: Clemson
State: SC
Country: United States
Zip Code: 29634