We aim to reduce surgical robotic errors by developing novel technology to coach experienced practitioners, using real-time, data-driven predictive models of operator behavior, task difficulty, and expertise level during complex surgical training tasks. This technology could increase the effectiveness of simulation-based training, particularly for practicing clinicians, as the predictive models will inform the design of adaptive and personalized feedback for the surgeon. Surgical training typically involves didactic learning, skills labs, and practice on live patients. Safety concerns associated with training on patients have led to significant developments in simulation-based technology; however, existing simulators may lack the ability to promote mastery of skills for practicing providers. Improved training is important for both the provider and the patient: an estimated 100,000 deaths per year occur due to preventable medical errors, and in robotic surgeries, the majority of patient injuries can be attributed to inexperience and lack of technical competence of the attending surgeon. These errors could potentially be avoided through personalized and adaptive coaching. In general, robotic systems can sense and adapt to their environment, and even act autonomously to complete a task. However, the majority of surgical robots used today are teleoperated systems, which perform only tasks directly commanded by the human operator, possibly with some motion scaling or tremor cancellation. This is a missed opportunity to leverage the intelligence of robotic systems to sense and interpret the movements of the surgeon and to enable adaptive feedback for personalized coaching. Our prior work in human-centric modeling could hold the key to the technical challenge of integrating intelligent methods into existing surgical robotic training platforms by characterizing the technical strengths and weaknesses of the practicing surgeon in a data-driven manner.
The long-term goal of this project is to improve surgical training outcomes by developing a personalized, adaptive surgical robotic coach capable of providing meaningful feedback to the practicing provider to optimize learning and skill transfer. The specific aims of the proposal are to: (1) evaluate the ability of human-centric models to characterize surgeon performance using motion and video data, (2) design adaptive haptic or visual guidance cues to provide learners with real-time feedback and to optimize learning, and (3) evaluate the effectiveness of the adaptive technology coach through end-user validation using procedure-specific training models for general surgery, urology, and gynecologic oncology. This project could significantly improve provider training in robotic surgery. It could also improve provider training for laparoscopic and open surgery, as the models used to develop the virtual coach are inherently human-centric and not tied to any specific surgical task or platform. Our team is uniquely positioned to achieve success in this project, bringing together experts in surgical robotics, human-centric modeling, machine learning, and advanced surgical training. We have conducted extensive preliminary studies in areas related to this proposal, supporting its feasibility. Our integration with the Simulation Center at UTSW will enable translation of successful outcomes of this project into the surgical training and retraining pipeline.

Public Health Relevance

Human-generated, preventable errors, particularly those made intra-operatively, can lead to morbidity and mortality for the patient and high costs for the hospital. While simple inanimate trainers have been successfully developed for surgical resident education, there is a lack of simulation-based technology for practicing surgeons. This research leverages novel data-driven methods to quantify how, and how well, a surgeon is performing a task, with the long-term goal of developing personalized, adaptive virtual coaches for surgical robotic systems that will lead to better patient outcomes.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Research Project (R01)
Project #
1R01EB030125-01
Application #
10037429
Study Section
Biomedical Computing and Health Informatics Study Section (BCHI)
Program Officer
Peng, Grace
Project Start
2020-09-18
Project End
2024-06-30
Budget Start
2020-09-18
Budget End
2021-06-30
Support Year
1
Fiscal Year
2020
Total Cost
Indirect Cost
Name
University of Texas at Austin
Department
Engineering (All Types)
Type
Biomed Engr/Col Engr/Engr Sta
DUNS #
170230239
City
Austin
State
TX
Country
United States
Zip Code
78759