Physically Realistic Virtual Surgery

Abstract

While virtual reality (VR)-based surgical simulation technology is being developed to improve laparoscopic surgical training outside the operating room (OR), existing simulators focus mostly on the technical skills (TS) of hand-eye coordination for isolated tasks and seldom on the non-technical skills (NTS) associated with cognitive skills of decision making and interpersonal skills of communication, teamwork, and conflict resolution. To enable VR-based surgical simulators to also train cognitive skills, in the previous grant period we successfully developed the next generation (Gen2) of laparoscopic surgical simulators, which immerse the trainee in a virtual OR using a head-mounted display (HMD) system and introduce distractions, interruptions, and other stressors to capture the high-stress environment of the real OR. However, to the best of our knowledge, no VR-based simulator exists for training the interpersonal skills needed for the multidisciplinary integration of OR teams, which consist of surgeons, anesthesiologists, and perioperative nurses.

Following the significant reduction of adverse events in other disciplines, such as aviation, after the introduction of mandatory simulation-based team training (e.g., crew resource management), the National Surgical Skills Curriculum developed by the American College of Surgeons (ACS) and the Association of Program Directors in Surgery (APDS) has prescribed ten team-based training modules to be performed in a simulation facility (e.g., an OR endosuite) with scenario-based training on high-fidelity manikin simulators. However, such facility-based team training is extremely expensive and cumbersome: it requires dedicated facility and faculty time and entails significant planning and schedule coordination among trainees, technicians, and faculty.

To overcome the challenges of facility-based OR team training, the goal of this project is to extend the immersive VR technology (Gen2) developed under our prior grant for a single user to the entire OR team, and to harness recent advances in cloud computing, mobile device-based VR, and artificial intelligence and machine learning to design, develop, and evaluate a Virtual Operating Room Team Experience (VORTeX) simulation system. VORTeX will allow the OR team to train together in a distributed fashion (i.e., not co-located in the same room or simulation facility) while wearing mobile device-based HMD systems, further developing their NTS through computer-generated simulation scenarios that replace the physical ones. Evaluation of the simulation scenarios will be performed asynchronously by a team of experts based on post-action replays. We will implement VORTeX for a laparoscopic cholecystectomy crisis scenario, developed and validated by our Co-I Dr. Dan Jones at BIDMC and adopted as one of the team training modules of the ACS/APDS National Surgical Skills Curriculum. We hypothesize that VORTeX will be at least as good as, or better than, traditional facility-based simulation in providing non-technical skills training to OR teams.
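The abstract describes asynchronous, replay-based evaluation of distributed team sessions. As a purely illustrative sketch, and not the VORTeX implementation itself, the Python snippet below shows one way a team session could be recorded as a timestamped event log and serialized for later expert review; all class, field, and scenario names are hypothetical assumptions introduced for illustration only.

    # Hypothetical sketch of a session event log for post-action replay.
    # Names and schema are assumptions, not part of the VORTeX design.
    import json
    import time
    from dataclasses import dataclass, field, asdict


    @dataclass
    class SessionEvent:
        """One timestamped action by a team member (e.g., surgeon, nurse)."""
        t: float          # seconds since session start
        role: str         # e.g., "surgeon", "anesthesiologist", "perioperative_nurse"
        action: str       # coded or free-text action label
        detail: dict = field(default_factory=dict)


    @dataclass
    class SessionLog:
        """Recording of one distributed team session for asynchronous expert review."""
        scenario: str
        start_time: float = field(default_factory=time.time)
        events: list = field(default_factory=list)

        def record(self, role: str, action: str, **detail) -> None:
            # Append a new event stamped relative to the session start.
            self.events.append(
                SessionEvent(t=time.time() - self.start_time,
                             role=role, action=action, detail=detail))

        def to_json(self) -> str:
            # Serialize the full log so reviewers can replay it later.
            return json.dumps({"scenario": self.scenario,
                               "events": [asdict(e) for e in self.events]},
                              indent=2)


    # Example usage: record a few events, then serialize for later replay.
    log = SessionLog(scenario="laparoscopic_cholecystectomy_crisis")
    log.record("surgeon", "request_instrument", instrument="clip_applier")
    log.record("perioperative_nurse", "acknowledge_request")
    print(log.to_json())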

Public Health Relevance

The goal of this research is to develop and validate a comprehensive computer-based technology that will allow surgical trainees to practice their surgical skills on computer-based models. Surgical procedures and techniques, learned and perfected in this risk-free manner before application to patients, will translate into fewer operating room errors, reduced patient morbidity, and improved patient outcomes, resulting in faster healing, shorter hospital stays, and reduced post-surgical complications and treatment costs.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Research Project (R01)
Project #
2R01EB005807-09A1
Application #
9969824
Study Section
Bioengineering, Technology and Surgical Sciences Study Section (BTSS)
Program Officer
Peng, Grace
Project Start
2006-06-01
Project End
2024-01-31
Budget Start
2020-05-01
Budget End
2021-01-31
Support Year
9
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Rensselaer Polytechnic Institute
Department
Engineering (All Types)
Type
Biomed Engr/Col Engr/Engr Sta
DUNS #
002430742
City
Troy
State
NY
Country
United States
Zip Code
12180
Nemani, Arun; Ahn, Woojin; Cooper, Clairice et al. (2018) Convergent validation and transfer of learning studies of a virtual reality-based pattern cutting simulator. Surg Endosc 32:1265-1272
Karaki, Wafaa; Rahul; Lopez, Carlos A et al. (2018) A Two-Scale Model of Radio-Frequency Electrosurgical Tissue Ablation. Comput Mech 62:803-814
Sankaranarayanan, Ganesh; Wooley, Lizzy; Hogg, Deborah et al. (2018) Immersive virtual reality-based training improves response in a simulated operating room fire scenario. Surg Endosc 32:3439-3449
Cetinsaya, Berk; Gromski, Mark A; Lee, Sangrock et al. (2018) A task and performance analysis of endoscopic submucosal dissection (ESD) surgery. Surg Endosc :
Nemani, Arun; Kruger, Uwe; Cooper, Clairice A et al. (2018) Objective assessment of surgical skill transfer using non-invasive brain imaging. Surg Endosc :
Han, Zhongqing; Rahul; De, Suvranu (2018) A Multiphysics Model for Radiofrequency Activation of Soft Hydrated Tissues. Comput Methods Appl Mech Eng 337:527-548
Dorozhkin, Denis; Olasky, Jaisa; Jones, Daniel B et al. (2017) OR fire virtual training simulator: design and face validity. Surg Endosc 31:3527-3533
Ye, Hanglin; De, Suvranu (2017) Thermal injury of skin and subcutaneous tissues: A review of experimental approaches and numerical models. Burns 43:909-932
Demirel, Doga; Yu, Alexander; Baer-Cooper, Seth et al. (2017) Generative Anatomy Modeling Language (GAML). Int J Med Robot 13:
Dargar, Saurabh; Akyildiz, Ali C; De, Suvranu (2017) In Situ Mechanical Characterization of Multilayer Soft Tissue Using Ultrasound Imaging. IEEE Trans Biomed Eng 64:2595-2606

Showing the most recent 10 out of 112 publications