The proposed research aims to understand fundamental surgical interactions so that technologies can be improved to make those interactions as consistent and intuitive as possible across interfaces. Teleoperative robotic systems and virtual environment simulations can make surgery easier to learn and perform. With current technology, however, robotic surgery feels different from open surgery, which in turn differs markedly from minimally invasive surgery, and simulated procedures often fail to realistically reproduce any of these.
Two of the most difficult aspects of minimally invasive techniques are coping with limited haptic feedback and working dexterously under the kinematic constraints of the instruments. The most time-consuming portions of surgical procedures are typically dissection, which is hampered by poor haptics, and repair by suturing, which is made challenging by limited dexterity. These two skills will be studied in parallel thrusts. In each thrust, the skill will be modeled from experimental data. The models will be validated in simulation and on robotic testbeds, then used to optimize the design of teleoperative surgical systems and virtual training environments.
This research is expected to have significant impact on the technology and teaching of minimally invasive surgery, enabling broader adoption of beneficial techniques while improving medical outcomes and reducing errors.