This project will develop and validate technology that provides surgeons with new and powerful means to examine and interact with preoperative patient data -- in a manner that improves patient safety, increases surgeon confidence, shortens procedure times, and improves outcomes. Our underlying hypothesis is that, by providing surgeons with a high-fidelity interactive simulation environment in which they can visually and physically interact with very realistic patient-specific simulations, they will be able to plan and then rehearse a patient's procedure with sufficient immersion that performing the actual procedure will feel familiar and can be performed with confidence, precision, and improved outcomes. These broad goals will be achieved by accomplishing four underlying aims: 1) the design and development of a visuohaptic workstation (enabling stereo visual and haptic touch interactions with the medical simulations) and the creation of advanced haptic rendering algorithms to enable interaction of virtual tools with detailed patient-specific anatomy (derived from volumetric image data); 2) the development of methods for photo-realistic rendering of large volumetric datasets acquired from patient-specific clinical imaging (including realistic rendering of translucent biomaterials and wet surfaces) to portray the subtle visual cues essential for making surgical decisions; and 3) the development of methods for real-time, physics-based simulation of interactions between rigid bodies and deformable tissues (including the models and algorithms needed to simulate surgical manipulations such as retraction, incision, and resection). In achieving these first three aims, we will create a cost-effective visuohaptic workstation for rehearsal by exploiting emerging multi-core and graphics processing units together with high-quality graphic and haptic displays, all of whose costs benefit from increasing consumer demand and technical breakthroughs.
Finally, our fourth aim will be to assess the accuracy and utility of our patient-specific surgical rehearsal environment for use in cranial-base surgery (focusing on procedures limited to the posterolateral and anterior cranial base) and to validate our hypothesis through studies of i) subjective "realism" and ease of use, ii) the ability to predict exposure of critical anatomy, and iii) its impact on surgical cases.

Public Health Relevance

It is common for a doctor today to carefully study diagnostic images prior to embarking on a clinical procedure. Even 50 years ago, though the quality of early X-ray images was poor, such images provided life-saving guidance from the very beginning, and they will continue to do so. With the ever-increasing quantity and quality of diagnostic information available, it is easy to see how doctors could become overwhelmed by all the data. This project will take the next important step in enabling doctors to interact effectively with highly dense sources of 3D information, by enabling them to interact in real time with physical models of their actual patients using their visual and touch abilities. Enabling doctors to plan different approaches to a procedure and then rehearse the actual procedure before entering the operating room promises to provide new levels of assurance and success in the surgical treatment of disease, restorations, trauma, and more.

Agency
National Institutes of Health (NIH)
Institute
National Library of Medicine (NLM)
Type
Research Project (R01)
Project #
5R01LM010673-04
Application #
8724556
Study Section
Biomedical Library and Informatics Review Committee (BLR)
Program Officer
Sim, Hua-Chuan
Project Start
2011-09-01
Project End
2015-08-31
Budget Start
2014-09-01
Budget End
2015-08-31
Support Year
4
Fiscal Year
2014
Total Cost
$309,722
Indirect Cost
$102,967
Name
Stanford University
Department
Surgery
Type
Schools of Medicine
DUNS #
009214214
City
Stanford
State
CA
Country
United States
Zip Code
94305
Forsslund, Jonas; Chan, Sonny; Selesnick, Joshua et al. (2013) The effect of haptic degrees of freedom on task performance in virtual surgical environments. Stud Health Technol Inform 184:129-35