"Electrosurgery" is now becoming universally accepted as the technique of choice in most minimally invasive surgical (MIS) procedures for achieving a variety of tissue effects ranging from dissection to hemostasis (control of bleeding) using high frequency electrical energy. However, there exists no standardized curriculum or training regimen outside the operating room (OR), for the surgical community to safely and effectively use the complex electrosurgical instruments. It is anticipated that a virtual reality (VR)-based trainer, with visual and haptic (touch) feedback, will be invaluable for electrosurgical skill training, allowing the trainees to attain competence in a controlled environment that does not expose actual patients to the bare brunt of their "learning curves";customization based on individual needs;and real time feedback, mentoring and objective assessment without the need for a proctor. While a few VR-based trainers exist for laparoscopic psychomotor skill training (i.e., training for hand-eye coordination and motor skills necessary for tasks such as tool movement, cutting, suturing, etc), none exists specifically for electrosurgical procedures as major technological hurdles must be overcome, including (1) realistic physics-based modeling of the complex bio- physics of tissue cutting, hemostasis and tissue joining;(2) physical in vivo experiments to determine tissue parameters and support modeling and validation;and (3) novel realistic VR interfaces. The goal of this project is to overcome these technological barriers and design, develop and evaluate the first Virtual Electrosurgical Skill Trainer (VEST). 
To accomplish these goals, a multidisciplinary team has been assembled to pursue the following Specific Aims. (SA1) Develop physics-based computational technology for modeling, in real time, the interaction of electrosurgical devices with soft tissue: Specifically, we will develop physics-based computational models of electrosurgical tissue cutting, joining, and hemostasis based on in vivo studies, together with novel computational algorithms that enable real-time performance. (SA2) Design and develop a realistic VEST platform: We will integrate the computational models and experimental data generated in SA1 and develop a VEST prototype with training scenarios for (1) tissue dissection, (2) arc fulguration, and (3) coaptive vessel closure. The VEST will include real-time feedback identifying errors; visual, auditory, and haptic cues to guide the trainee; display of the physiological consequences of surgical complications; effects of alternative surgical procedures and devices; and automatic, real-time assessment of surgical skill. (SA3) Establish the validity of the VEST as a training tool: We will conduct experiments at the Skills Lab at BIDMC to ensure that the tasks in the VEST reflect the technical skills of electrosurgery and that the scores measured in the VEST are appropriate performance metrics for assessing training. (SA4) Evaluate the usefulness of the VEST as a training tool: By dividing subjects into practice and non-practice groups, we will study whether training on the VEST transfers positively to the OR.

Public Health Relevance

The goal of this research is to develop and validate a comprehensive computer-based technology that will allow surgical trainees to practice their surgical skills on computer-based models. Surgical procedures and techniques, learned and perfected in this risk-free manner before application to patients, will translate to fewer operating room errors, reduced patient morbidity, and improved patient outcomes, resulting in faster healing, shorter hospital stays, and reduced post-surgical complications and treatment costs.

National Institutes of Health (NIH)
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Research Project (R01)
Study Section: Special Emphasis Panel (ZRG1-SBIB-Q (80))
Program Officer: Peng, Grace
Institution: Rensselaer Polytechnic Institute, Schools of Engineering, Engineering (All Types), United States
Nemani, Arun; Sankaranarayanan, Ganesh; Olasky, Jaisa S et al. (2014) A comparison of NOTES transvaginal and laparoscopic cholecystectomy procedures based upon task analysis. Surg Endosc 28:2443-51
Olasky, Jaisa; Chellali, Amine; Sankaranarayanan, Ganesh et al. (2014) Effects of sleep hours and fatigue on performance in laparoscopic surgery simulators. Surg Endosc 28:2564-8
Lu, Zhonghua; Arikatla, Venkata S; Han, Zhongqing et al. (2014) A physics-based algorithm for real-time simulation of electrosurgery procedures in minimally invasive surgery. Int J Med Robot 10:495-504
Dargar, Saurabh; Lam, Benjamin; Horodyski, Crystal et al. (2014) A Decoupled 2 DOF Force Feedback Mechanism for the Virtual Translumenal Endoscopic Surgical Trainer (VTEST). Stud Health Technol Inform 196:86-8
Allen, Brian F; Jones, Daniel B; Schwaitzberg, Steven D et al. (2014) Survey-based analysis of fundamental tasks for effective use of electrosurgical instruments. Surg Endosc 28:1166-72
Chellali, A; Zhang, L; Sankaranarayanan, G et al. (2014) Validation of the VBLaST peg transfer task: a first step toward an alternate training standard. Surg Endosc 28:2856-62
Roche, Christopher A; Sankaranarayanan, Ganesh; Dargar, Saurabh et al. (2014) Kinematic Measures for Evaluating Surgical Skills in Natural Orifice Translumenal Endoscopic Surgery (NOTES). Stud Health Technol Inform 196:339-45
Sankaranarayanan, Ganesh; Li, Baichun; De, Suvranu (2014) A framework for providing cognitive feedback in surgical simulators. Stud Health Technol Inform 196:369-71
Ahn, Woojin; Dargar, Saurabh; Halic, Tansel et al. (2014) Development of a Virtual Reality Simulator for Natural Orifice Translumenal Endoscopic Surgery (NOTES) Cholecystectomy Procedure. Stud Health Technol Inform 196:1-5
Allen, Brian F; Schwaitzberg, Steven D; Jones, Daniel B et al. (2014) Toward the development of a virtual electrosurgery training simulator. Stud Health Technol Inform 196:11-3

Showing the most recent 10 out of 14 publications