Information technology is making great progress in the operating room. In the practice of minimally invasive surgery, three-dimensional modeling techniques are enabling preoperative planning, new image acquisition and display techniques, and superior dexterity through teleoperated robotic systems. However, one sensory channel is presently ignored in these technological improvements: touch. In manual surgery, haptic feedback is crucial for palpation, suture manipulation, and detection of puncture events. Sensing of forces is particularly important for invasive tool-tissue interaction tasks such as grasping, cutting, dissection, and percutaneous (GCDP) therapy. The lack of haptic information overloads the surgeon's visual channel, demanding an exhausting level of attention. For information technology to truly enhance the practice of surgery, multiple sensory channels must be utilized.

We propose to develop instrumentation and algorithms for modeling the haptic aspects of tool-tissue interaction in four common surgical tasks: grasping, cutting, dissection, and percutaneous therapy. This system will significantly enhance information display in three ways. First, data acquired in real time will be used to provide feedback to the surgeon during model-based teleoperated procedures, increasing the transparency of the robot-assisted surgical system. Second, real-time modeling techniques will enable model-based teleoperation, removing the strict constraints imposed by time delays in traditional, direct teleoperation. Third, realistic surgical simulations will improve training, increasing surgeon competence and patient safety. The instrumentation and modeling algorithms will be used to determine parameter values for different tissue types, particularly liver, prostate, spleen, and kidney; ex-vivo tissues and phantom hydrogel tissues will be used in these experiments. Once the instrumentation and modeling techniques are developed, we will proceed to extensive validation of the three applications of enhanced information display. Performance experiments will verify improvements in accuracy and precision for direct and model-based teleoperation, and a computer vision/force sensing system will determine the realism of the force and deformation models developed for surgical simulation. This proposal addresses the ITR challenge of developing an information-enhanced display with computational, simulation, and data analysis methods for modeling common surgical GCDP tasks for which there is currently no systematic model.
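As an illustration only (not the proposal's actual method), the kind of tissue parameter estimation described above can be sketched with a simple linear viscoelastic (Kelvin-Voigt) model, F = k*x + b*xdot, fit by least squares to tool force and indentation data. The function name, the parameter values, and the synthetic data below are all hypothetical placeholders standing in for measurements on ex-vivo or phantom hydrogel tissues.

```python
# Minimal sketch, assuming a Kelvin-Voigt tissue model F = k*x + b*xdot.
# All names and numbers are illustrative, not from the proposal.
import numpy as np

def fit_kelvin_voigt(t, x, f):
    """Least-squares estimate of stiffness k and damping b from
    indentation depth x(t) and measured tool force f(t)."""
    xdot = np.gradient(x, t)            # numerical velocity
    A = np.column_stack([x, xdot])      # regressor matrix [x, xdot]
    (k, b), *_ = np.linalg.lstsq(A, f, rcond=None)
    return k, b

if __name__ == "__main__":
    # Synthetic data standing in for a force-sensing experiment.
    t = np.linspace(0.0, 2.0, 500)
    x = 0.005 * np.sin(2 * np.pi * 1.0 * t)        # 5 mm sinusoidal indentation
    true_k, true_b = 300.0, 2.0                    # hypothetical N/m, N*s/m
    f = true_k * x + true_b * np.gradient(x, t)
    f += np.random.normal(0.0, 0.01, size=f.shape) # sensor noise
    k, b = fit_kelvin_voigt(t, x, f)
    print(f"estimated k = {k:.1f} N/m, b = {b:.2f} N*s/m")
```

In practice, more elaborate nonlinear or finite-element models would be needed for tasks such as cutting and needle insertion; this sketch only shows the general fit-parameters-from-force-data workflow.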

Intellectual Merit: The specific goal of this project is to significantly improve the information-enhanced operating room through the sensing and acquisition of models representing haptic information during tool-tissue interactions in minimally invasive surgery. This research will significantly impact: (a) tool-tissue interaction models that reflect the actual forces and deformations occurring during surgery, (b) haptic feedback to the surgeon during direct or model-based teleoperation, thereby improving surgical outcomes, (c) the development of realistic simulations for training current and future health care professionals, and (d) the development of instrumented smart tools for both traditional minimally invasive and robot-assisted surgeries.

Broader Impacts: The PIs at both institutions have an excellent history of involving undergraduate students (both men and women) in research projects through the Research Experience for Undergraduates (REU) program. Additionally, the Drexel Research Experience for Teachers (RET) site proposal, recommended for funding by NSF, along with an established RET program at Johns Hopkins, will involve high-school math and science teachers in summer research projects related to this proposal. These activities will enhance the participation of underrepresented groups from inner-city schools and lead to broader dissemination of scientific and technological education to students and teachers. For graduate students, the interdisciplinary nature of this research offers new opportunities in education, broadening the interaction between mechanical engineers and medical professionals. Finally, the proposed research will lead to improvements in surgical outcomes, benefiting patients and society at large.

Because of the diverse areas of research it requires, this project will benefit from the collaboration of PIs at Drexel and JHU. Drexel has expertise in haptics, FEM modeling, grasping/dissection tasks, and phantom tissues, while JHU has expertise in reality-based modeling for cutting/percutaneous therapies, haptics, and validation experiments for human-machine systems. The unique facilities and collaborations at both institutions, along with strong ties to medical professionals, make this work well suited to a collaborative research project.

Budget Start: 2003-08-15
Budget End: 2006-07-31
Fiscal Year: 2003
Total Cost: $177,794
Institution: Johns Hopkins University
City: Baltimore
State: MD
Country: United States
Zip Code: 21218