Intellectual Merit: Human-like dexterous manipulation is featured as a grand challenge in the 2009 report A Roadmap for U.S. Robotics. Human dexterity relies heavily on tactile sensation and is further shaped by proprioceptive and visual feedback. The proposed work aims to advance artificial manipulators by integrating a new class of multimodal tactile sensors with anthropomorphic artificial hands and by developing generalizable routines for context-driven haptic inquiry of objects based on the requirements of grasp and manipulation tasks. A primary goal is to enable a robot hand to learn efficiently about objects in an unstructured environment through touch, specifically in cases where computer vision cannot provide critical information about physical hand-object interactions. Although vision supplies preliminary information about an object and its surroundings, it cannot capture everything needed for successful hand-object interaction, particularly when the digits are occluded by the grasped object or when the interaction occurs entirely out of view. Inspiration for the haptic inquiry framework will be drawn from the suite of exploratory procedures humans use during haptic exploration. In contrast to open-ended haptic exploration, haptic inquiry will require that the order of, and time spent on, each exploratory procedure depend on task goals. The order and type of questions asked haptically will be context-dependent and designed to yield high-level, task-directed information at a low cost of inquiry. The weight given to each mode of tactile sensing (force, vibration, temperature) will likewise be tuned to the context of the task. This proposal aims to strengthen the robustness of co-robot systems by developing a framework for context-driven, task-directed haptic inquiry that integrates multi-digit tactile and proprioceptive data in a task-appropriate manner. The framework will be developed and deployed on an anthropomorphic robot hand outfitted with a new class of commercially available multimodal tactile sensors. The work is transformative because it will enable co-robot systems to remain functional even in the absence of visual feedback, which is typically the primary form of feedback for robotic systems. The long-term research objective of this proposal is to reduce the cognitive burden on the user of an artificial manipulator.
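As a purely illustrative sketch (the contexts, procedure names, and weights below are hypothetical placeholders, not values from the proposal), the core idea of selecting an exploratory-procedure order and per-modality sensing weights from a task context can be expressed as a simple lookup in Python:

    # Hypothetical sketch: map a task context to an ordered list of exploratory
    # procedures and to weights for each tactile modality (force, vibration,
    # temperature). All names and numbers are illustrative placeholders.
    TASK_CONTEXTS = {
        "grasp_fragile_object": {
            "procedure_order": ["light_contact", "enclosure", "pressure_probe"],
            "modality_weights": {"force": 0.7, "vibration": 0.2, "temperature": 0.1},
        },
        "identify_material": {
            "procedure_order": ["lateral_slide", "static_contact"],
            "modality_weights": {"force": 0.2, "vibration": 0.4, "temperature": 0.4},
        },
    }

    def plan_haptic_inquiry(task_context):
        """Return the procedure order and modality weights for a given task context."""
        plan = TASK_CONTEXTS[task_context]
        return plan["procedure_order"], plan["modality_weights"]

    if __name__ == "__main__":
        order, weights = plan_haptic_inquiry("identify_material")
        print("Exploratory procedures:", order)
        print("Modality weights:", weights)

In the proposed framework such a mapping would be tuned or learned rather than hard-coded; the table above only illustrates the intended interface between task context and haptic inquiry.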

Broader Impacts: The proposed translational research could enhance the functional capabilities of co-robot systems in which humans use artificial manipulators to work in unstructured, unsafe, or limited-access environments (prosthetic, rehabilitative, assistive, space, underwater, military, rescue, and surgical applications). The proposed work could benefit the human user of a co-robot system by empowering the robot to control low-level perception-action loops autonomously without burdening the human. The Robot Operating System (ROS) may be used to simulate and control an anthropomorphic robot hand driven by commercially available actuators and outfitted with commercially available tactile sensors. Custom source code (C, MATLAB, ROS) and an open-source haptic library for a commercially available tactile sensor (suitable for data mining) will be made publicly available for the benefit and advancement of the robotics community.
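As a minimal, hypothetical example of how multimodal tactile data might be streamed into such a framework under ROS (the topic name, message type, and data layout below are assumptions for illustration, not the interface of any specific commercial sensor or library):

    #!/usr/bin/env python
    # Hypothetical ROS node: subscribes to a generic tactile topic and logs the
    # incoming samples. The topic name and the flat [force, vibration,
    # temperature] layout per digit are assumptions for illustration only.
    import rospy
    from std_msgs.msg import Float32MultiArray

    def tactile_callback(msg):
        # Log the raw multimodal sample; downstream code would weight and
        # interpret it according to the current task context.
        rospy.loginfo("Tactile sample: %s", list(msg.data))

    def main():
        rospy.init_node("haptic_inquiry_listener")
        rospy.Subscriber("/tactile_data", Float32MultiArray, tactile_callback)
        rospy.spin()  # process tactile messages until the node is shut down

    if __name__ == "__main__":
        main()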

Project Start:
Project End:
Budget Start: 2014-07-01
Budget End: 2018-09-30
Support Year:
Fiscal Year: 2014
Total Cost: $454,632
Indirect Cost:
Name: University of California Los Angeles
Department:
Type:
DUNS #:
City: Los Angeles
State: CA
Country: United States
Zip Code: 90095