Our ultimate goal is to provide a plug-and-play unified real-time communication (URTC) network in operating rooms (ORs) for image-guided therapy (IGT), where imagers, sensors, surgical robots, and computers from different vendors work cooperatively. URTC will ensure seamless data flow among those components to enable the closed-loop process of planning, control, delivery, and feedback in IGT. This closed-loop process makes IGT interactive and adaptive, and hence accurate, particularly when the configuration is rapidly changing. None of the existing standards truly supports URTC in ORs. We are uniquely positioned to address this unmet need with our OpenIGTLink protocol; thanks to strong support from popular free and commercial software packages, it has been used extensively in at least 89 peer-reviewed publications since 2008.

Image guidance is the basis for a wide range of minimally invasive interventions, from percutaneous ablations to robotic surgeries. It enables accurate targeting and monitoring with minimal incisions by mapping image information onto the operating field. Image guidance is a closed-loop process: a physician plans how to approach a lesion using images (planning), maneuvers surgical tools to the lesion by hand or with a robotic device (control), delivers a therapeutic effect (delivery), monitors the effect using imaging and/or sensors (feedback), and updates the plan accordingly. A cycle of this closed-loop process must be short enough to keep up with rapid changes in the operating field, such as target displacement due to organ motion or temperature variation during thermal ablations. Closing the loop is a challenge, especially when the process involves hardware and software components from different vendors, such as imagers, surgical robots, and navigation software; those components need to communicate in real time during the process, yet there is no URTC standard suitable for the closed-loop process in IGT. Because of this lack of a standard, vendors can only provide proprietary communication interfaces, which are not interoperable.

OpenIGTLink aims to address this unmet need in the IGT research field and has been well accepted in the research community. However, the current OpenIGTLink emphasizes simplicity for academic research over the versatility required in medical-grade devices; therefore, it has yet to become a de facto standard that industry can rely on. Without industrial support, the merit of OpenIGTLink remains limited. The goals of this project are twofold: to improve the protocol's advantage in real-time communication, and to improve its versatility by providing interoperability with an existing medical record standard. We will pursue the following aims (an illustrative message-exchange sketch follows the aims list):
(Aim 1) Define OpenIGTLink message exchange schemes for real-time communication with quality of service (QoS) control;
(Aim 2) Investigate and develop interoperability with the DICOM standard;
(Aim 3) Validate extended OpenIGTLink in a robotic system for closed-loop image-guided robot-assisted prostate interventions.
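As a concrete illustration of the kind of real-time message exchange targeted in Aim 1, the sketch below pushes a single tracking transform to an OpenIGTLink server using the open-source OpenIGTLink C++ library. The host name, port, and device name are illustrative assumptions, and the QoS extensions proposed in this project are not shown; this is a minimal sketch, not the project's implementation.

```cpp
// Minimal sketch: send one TRANSFORM message to an OpenIGTLink server.
// Host, port, and device name are placeholders; error handling and the
// QoS extensions proposed in Aim 1 are omitted.
#include "igtlClientSocket.h"
#include "igtlTransformMessage.h"

int main()
{
  // Connect to a hypothetical navigation workstation on the default port.
  igtl::ClientSocket::Pointer socket = igtl::ClientSocket::New();
  if (socket->ConnectToServer("localhost", 18944) != 0)
    {
    return 1; // connection failed
    }

  // Build a TRANSFORM message carrying a 4x4 tool-to-image matrix.
  igtl::Matrix4x4 matrix;
  igtl::IdentityMatrix(matrix);          // placeholder pose

  igtl::TransformMessage::Pointer msg = igtl::TransformMessage::New();
  msg->SetDeviceName("NeedleGuide");     // hypothetical device name
  msg->SetMatrix(matrix);
  msg->Pack();                           // serialize header and body

  // Send the packed message and close the connection.
  socket->Send(msg->GetPackPointer(), msg->GetPackSize());
  socket->CloseSocket();
  return 0;
}
```

In a closed-loop setting, a loop of such messages (imager to navigation software to robot and back) would be repeated at the cycle rate discussed above, which is where the QoS control of Aim 1 applies.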

Public Health Relevance

Image-guided systems have been used during surgical interventions to help physicians identify abnormal regions and critical structures in the patient with minimal incisions; those systems help improve the safety and accuracy of surgical interventions by providing closed-loop feedback to the physicians. While image-guided systems are in demand in many clinical fields, developing such systems is costly and time-consuming because there is no standard for integrating the necessary components, such as computers, sensors, and imaging scanners, into a system. The goal of this project is to develop a common 'plug-and-play' interface to speed up the research and development of image-guided systems and make them available in a wider range of surgical interventions.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Research Project (R01)
Project #
5R01EB020667-02
Application #
9100712
Study Section
Biodata Management and Analysis Study Section (BDMA)
Program Officer
Pai, Vinay Manjunath
Project Start
2015-07-01
Project End
2018-06-30
Budget Start
2016-07-01
Budget End
2017-06-30
Support Year
2
Fiscal Year
2016
Total Cost
Indirect Cost
Name
Brigham and Women's Hospital
Department
Type
DUNS #
030811269
City
Boston
State
MA
Country
United States
Zip Code
Tokuda, Junichi; Chauvin, Laurent; Ninni, Brian et al. (2018) Motion compensation for MRI-compatible patient-mounted needle guide device: estimation of targeting accuracy in MRI-guided kidney cryoablations. Phys Med Biol 63:085010
Wartenberg, Marek; Schornak, Joseph; Gandomi, Katie et al. (2018) Closed-Loop Active Compensation for Needle Deflection and Target Shift During Cooperatively Controlled Robotic Needle Insertion. Ann Biomed Eng 46:1582-1594
Moreira, Pedro; Patel, Niravkumar; Wartenberg, Marek et al. (2018) Evaluation of robot-assisted MRI-guided prostate biopsy: needle path analysis during clinical trials. Phys Med Biol 63:20NT02
de Arcos, Jose; Schmidt, Ehud J; Wang, Wei et al. (2017) Prospective Clinical Implementation of a Novel Magnetic Resonance Tracking Device for Real-Time Brachytherapy Catheter Positioning. Int J Radiat Oncol Biol Phys 99:618-626
Frank, Tobias; Krieger, Axel; Leonard, Simon et al. (2017) ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment. Int J Comput Assist Radiol Surg 12:1451-1460
Kapur, Tina; Pieper, Steve; Fedorov, Andriy et al. (2016) Increasing the impact of medical image computing using community-based open-access hackathons: The NA-MIC and 3D Slicer experience. Med Image Anal 33:176-180
Tani, Soichiro; Tatli, Servet; Hata, Nobuhiko et al. (2016) Three-dimensional quantitative assessment of ablation margins based on registration of pre- and post-procedural MRI and distance map. Int J Comput Assist Radiol Surg 11:1133-42