This proposal is a collaborative effort between Yale University and the medical device company Eigen to develop a more robust and accurate method for the deformable fusion of pre-operative magnetic resonance images (MRI) and intra-procedure trans-rectal ultrasound (TRUS) images to guide prostate biopsy. Prostate cancer is a major cause of death in the U.S., and biopsy is the gold standard for its diagnosis. Each year, over 1.1 million prostate biopsies are performed in the U.S. alone at a cost of well over $1 billion. Most of these biopsies are performed under TRUS guidance in a systematic fashion but blind to potential tumor locations, because many tumors are invisible on ultrasound. This procedure is highly inefficient, and as many as 30% of clinically significant tumors may be missed on a first biopsy. Multi-parametric MRI (mpMRI) acquired before the biopsy can be used to define potential targets and then used in conjunction with live TRUS imaging to perform targeted, image-guided biopsy. A key challenge is the accurate registration (fusion) of the intra-procedure TRUS with the pre-acquired MRI. This fusion is often performed using rigid registration, which does not account for prostate deformation between the MRI and TRUS acquisitions. Eigen's Artemis system is one of a newer generation of devices that performs deformable mapping between MRI and TRUS. Current fusion methods (including that in Artemis), however, are highly operator-dependent: they rely on semi-automated TRUS segmentation by the clinician during the procedure, which is error-prone, time-consuming, and critically reliant on operator skill. We propose to develop and test an innovative image analysis strategy for deformable registration based on a statistical deformation model learned from an existing large database (N=100) of MRI-TRUS prostate image pairs. The model will both reduce the dimensionality of the registration search and constrain the deformation.
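The core idea of a statistical deformation model learned from a training database can be sketched as a principal component analysis (PCA) over flattened displacement vectors, with registration then searching over a small number of mode coefficients rather than the full deformation. This is a minimal, hypothetical illustration of the technique named above; all function names, shapes, and the plain SVD-based PCA are assumptions, not the proposal's actual implementation.

```python
import numpy as np

def learn_deformation_model(deformations, n_modes=10):
    """Learn a low-dimensional deformation basis by PCA.

    deformations: (N, D) array of flattened displacement vectors,
                  one per MRI-TRUS training pair (D = 3 * n_points).
    Returns the mean deformation, the top principal modes, and the
    variance explained by each mode.
    """
    mean = deformations.mean(axis=0)
    centered = deformations - mean
    # SVD of the centered data yields the principal modes of variation.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_modes]                              # (n_modes, D)
    variances = (s[:n_modes] ** 2) / (len(deformations) - 1)
    return mean, modes, variances

def synthesize(mean, modes, coeffs):
    """Reconstruct a full deformation from low-dimensional coefficients."""
    return mean + coeffs @ modes
```

Constraining the search to the span of a few modes both shrinks the optimization problem and restricts solutions to deformations resembling those observed in the training population.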
The proposed registration method also incorporates a novel regional confidence estimate for the intra-procedure interactive TRUS segmentation, enhancing robustness by focusing the registration on the more confident portions of the bounding surface. These methods are intended to streamline the clinician's workflow by providing a more accurate and robust registration that is less sensitive to segmentation errors. This project is significant in that it has the potential to improve the utilization of MRI-TRUS fusion-guided prostate biopsy outside of major academic hospitals by alleviating the workflow's dependence on exact prostate segmentation from often poor-quality TRUS images.
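The regional confidence idea can be illustrated as a per-point weighting of a point-based registration cost, so that boundary segments with low segmentation confidence contribute less. This is a hedged sketch under assumed names and a simple normalized weighting; it is not the actual Artemis algorithm or the proposal's cost function.

```python
import numpy as np

def weighted_registration_error(fixed_pts, moving_pts, confidences):
    """Confidence-weighted mean point-to-point distance.

    fixed_pts, moving_pts: (N, 3) corresponding surface points (e.g.,
                           segmented TRUS boundary vs. mapped MRI boundary).
    confidences: (N,) per-point segmentation confidence, >= 0.
    Points with zero confidence are effectively ignored.
    """
    d = np.linalg.norm(fixed_pts - moving_pts, axis=1)
    w = confidences / confidences.sum()   # normalize weights to sum to 1
    return float(np.sum(w * d))
```

An optimizer minimizing such a weighted cost is steered by the reliable parts of the segmented surface rather than by regions where the TRUS boundary is ambiguous.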
In Aim 1, we will develop the new registration procedure and demonstrate its robustness to variations in prostate segmentation by showing superior reproducibility compared with current fusion methods.
In Aim 2, we will perform a clinical validation study to demonstrate our method's superior accuracy (as measured by landmark registration error) compared to the current non-rigid registration method in our Artemis system. In a potential Phase II application, we will perform a multi-site clinical validation and then incorporate the new method in future commercial versions of Artemis.
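The accuracy metric in Aim 2, landmark registration error, is the residual distance between corresponding anatomical landmarks after registration. A minimal sketch (illustrative names only, not the study's evaluation code):

```python
import numpy as np

def landmark_error(mri_landmarks, registered_trus_landmarks):
    """Mean Euclidean distance between corresponding landmarks.

    Both arrays are (N, 3) point sets in a common coordinate frame
    (e.g., in mm) after the registration has been applied.
    """
    d = np.linalg.norm(mri_landmarks - registered_trus_landmarks, axis=1)
    return float(d.mean())
```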
Prostate cancer is the second most common cancer in men, behind non-melanoma skin cancer, and biopsy is the clinical gold standard for diagnosis of the disease. This grant aims to enhance the accuracy and robustness of MRI-ultrasound fusion-guided prostate biopsies, with expected improved detection of disease leading to earlier diagnosis and improved survival. At the core of the effort is the development and evaluation of novel image analysis methods for MRI-ultrasound registration using a statistical deformation model based on population statistics of prostate deformation.
Onofrey, John A; Staib, Lawrence H; Sarkar, Saradwata et al. (2015) Learning nonrigid deformations for constrained point-based registration for image-guided MR-TRUS prostate intervention. Proc IEEE Int Symp Biomed Imaging 2015:1592-1595