This project seeks to develop tactile sensing technology that emulates many qualities of human skin. The goal is a sensor that can determine the texture and shape of objects that it touches, as well as the forces distributed across the surface. The new sensor is made of a block of clear elastomer, with a compliance similar to that of the human fingertip, covered with a flexible reflective skin. A small light source and a camera are embedded in the device. When an object contacts the skin, the surface is distorted, leading to a change in the reflected light pattern. Machine vision techniques convert the patterns into estimates of the forces on the skin. The project is testing a number of optical and mechanical designs, and is developing the corresponding image analysis techniques, in order to characterize and optimize the performance. Because the sensor is compliant, it can be built into a human-like robotic finger, providing gripping surfaces that are mechanically stable as well as highly sensitive. The new technology may also be useful in medical applications such as minimally invasive surgery, where it is important for the surgeon to sense the mechanical properties of the tissues that are being explored.

Project Report

The purpose of this project was to develop improved touch sensors for robotic fingertips. For many applications, the ideal fingertip should have these characteristics: it should be soft like a human finger, so that the robot can manipulate objects of various shapes; it should be sensitive to small forces; and it should have high spatial resolution, so that it can discriminate different geometries and textures. Current sensors fall short of the human fingertip in various ways. The new type of sensor developed in this project uses a technology called "GelSight" that matches the human fingertip in many respects and has spatial resolution far higher than that of human skin.

A GelSight sensor is made of a clear elastomer covered by an opaque reflective membrane. A camera and an illumination system are situated inside the finger and look through the elastomer at the membrane. When the fingertip contacts an object, the shape of the membrane is distorted, and the camera measures the distortion using methods from machine vision. By using a technique called photometric stereo (sketched in the first code example below), the sensor can recover the membrane's shape in great detail, resolving features finer than a human hair. The sensor can also detect small variations in the distortion pattern, which allows it to be used for lump detection (for example, detecting tumors); its sensitivity to lumps exceeds that of a human fingertip.

Another important capability is measuring tangential forces, also called shear forces, which arise from friction between the fingertip and the object being grasped or manipulated. The pattern of shear forces allows the robot to tell when an object is about to slip from its grasp, so the robot can respond by tightening its grip to prevent the slip. Most touch sensors are unable to measure these forces, but by tracking an array of markers on the membrane surface (see the second code example below), the GelSight sensor can measure the distribution of shear forces across the surface of the fingertip. The sensor also supports recognition and alignment of object shapes, which helps the robot plan and execute movements requiring accurate knowledge of the grasped object's position and orientation.
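The photometric-stereo step can be illustrated with a short sketch. The Python fragment below assumes three grayscale images of the membrane captured under three known illumination directions and an approximately Lambertian reflective coating; the function names, array shapes, and the naive gradient integration are illustrative assumptions, not the project's actual implementation.

# Minimal photometric-stereo sketch for a GelSight-style sensor (illustrative only).
import numpy as np

def estimate_normals(images, light_dirs):
    """images: list of three HxW arrays; light_dirs: 3x3 array of unit light vectors."""
    H, W = images[0].shape
    I = np.stack([im.reshape(-1) for im in images], axis=0)   # 3 x (H*W) intensities
    L = np.asarray(light_dirs, dtype=float)                   # 3 x 3 lighting matrix
    # Lambertian model: I = L @ g, where g = albedo * surface normal.
    g, *_ = np.linalg.lstsq(L, I, rcond=None)                 # 3 x (H*W)
    albedo = np.linalg.norm(g, axis=0) + 1e-8
    normals = (g / albedo).T.reshape(H, W, 3)
    return normals

def integrate_depth(normals):
    """Very rough depth map obtained by integrating the surface gradients."""
    nz = np.clip(normals[..., 2], 1e-3, None)
    p = -normals[..., 0] / nz          # dz/dx
    q = -normals[..., 1] / nz          # dz/dy
    # Average of two naive cumulative integrations; a Poisson or Fourier-domain
    # integrator would normally replace this for robustness to noise.
    return 0.5 * (np.cumsum(p, axis=1) + np.cumsum(q, axis=0))

The per-pixel least-squares solve is a single vectorized call, and in a practical system the naive cumulative sums would be replaced by a Poisson or Fourier-domain integrator before converting depths to physical units via calibration.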
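Shear measurement via marker tracking can likewise be sketched. The snippet below assumes dark markers printed on the membrane, a stored reference (undeformed) frame, and OpenCV for blob extraction; the threshold value and the nearest-neighbor matching are simplifying assumptions for illustration, not the project's actual code.

# Sketch of marker-based shear sensing (illustrative only).
import numpy as np
import cv2

def marker_centroids(gray, thresh=60):
    """Return an Nx2 array of (x, y) centroids of dark markers in a grayscale frame."""
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    return centroids[1:]   # drop the background component (label 0)

def shear_field(reference_gray, current_gray):
    """Match each reference marker to its nearest marker in the current frame and
    return per-marker displacement vectors, a proxy for the local shear deformation."""
    ref = marker_centroids(reference_gray)
    cur = marker_centroids(current_gray)
    disp = []
    for p in ref:
        d = np.linalg.norm(cur - p, axis=1)
        disp.append(cur[np.argmin(d)] - p)   # nearest-neighbor match
    return ref, np.array(disp)

For the small membrane deformations of interest, nearest-neighbor matching is usually sufficient; the resulting displacement field can then be inspected, for example for the peripheral slip pattern that precedes a full slip, so the gripper can tighten its grasp in time.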

Project Start:
Project End:
Budget Start: 2010-09-01
Budget End: 2014-08-31
Support Year:
Fiscal Year: 2010
Total Cost: $450,000
Indirect Cost:
Name: Massachusetts Institute of Technology
Department:
Type:
DUNS #:
City: Cambridge
State: MA
Country: United States
Zip Code: 02139