There are about 3 million American men living with prostate cancer, the second leading cause of cancer death among men in the United States. If prostate cancer is caught early, before it spreads to other parts of the body, most men will not die from it, whether it is managed by active monitoring or by treatment. Nevertheless, 22% to 47% of patients with negative biopsies but elevated prostate-specific antigen levels may still harbor malignant tumors that are potentially life threatening and that the commonly used ultrasound-guided random biopsy can miss. By contrast, fusion of magnetic resonance (MR) imaging and transrectal ultrasound (TRUS) for guiding targeted biopsies has been shown to significantly improve the cancer detection rate.

However, MR-TRUS fusion itself is very challenging because of the difficulty of directly registering images from two very different modalities of different dimensionality. To bypass this difficult registration problem, existing fusion techniques rely on specialized, expensive, and cumbersome hardware tracking devices, which increase cost and prolong procedures. More importantly, factors such as patient movement, respiratory motion, and changes in ultrasound transducer pressure can cause the prostate to move during a procedure and leave the images misaligned. Noticing and correcting such motion in a timely manner requires considerable skill and knowledge of radiological imaging, and studies show a steep learning curve for mastering fusion systems. When image registration and motion compensation fail, fusion-guided biopsy performs no better than random biopsy.

To address the fundamental cause of these problems, the goal of this project is to create enabling technology for MR-TRUS image fusion based solely on internal image content, without external tracking devices. The proposed research is foundational for developing the next generation of MR-TRUS fusion guidance systems for prostate biopsy that achieve robust performance at lower cost. Recent advances in machine learning, especially deep learning, provide new tools and new angles for tackling this challenging problem. This project aims to directly fuse 2D TRUS frames with a 3D MR volume by developing novel deep learning methods for image reconstruction and registration. The proposed methods are designed to exploit both population and patient-specific imaging information to align the images accurately: while learning-based image registration methods generally try to make better use of population knowledge to improve registration performance, few of them have been able to use patient-specific information efficiently, even though it is essential for robust and accurate performance. Upon successful completion, the innovation created in this project will disrupt the common perception that hardware tracking must be used for multimodal image fusion-guided interventions and will reduce the demand on physicians' experience and skill in image analysis and fusion, helping to obtain consistent results. The project will lead to the development of novel prostate biopsy systems and will also impact a range of other image-fusion-based interventional guidance technologies.
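To make the registration aim concrete, the sketch below shows one way a learning-based 2D-to-3D (slice-to-volume) registration model could be set up: a small network encodes a 2D TRUS frame and a 3D MR volume and regresses a 6-degree-of-freedom rigid pose placing the frame inside the volume. This is only an illustrative sketch under assumed layer sizes and names (e.g., SliceToVolumePoseNet); it is not the project's proposed method.

# Minimal sketch (illustrative only): a CNN regresses a 6-DOF rigid pose
# (3 rotations, 3 translations) aligning a 2D TRUS frame to a 3D MR volume.
# All architecture choices here are assumptions for demonstration.
import torch
import torch.nn as nn

class SliceToVolumePoseNet(nn.Module):
    def __init__(self):
        super().__init__()
        # 2D encoder for the TRUS frame
        self.trus_enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # 3D encoder for the MR volume
        self.mr_enc = nn.Sequential(
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        # Regress the 6 rigid-pose parameters from the concatenated features
        self.head = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 6),
        )

    def forward(self, trus_frame, mr_volume):
        features = torch.cat(
            [self.trus_enc(trus_frame), self.mr_enc(mr_volume)], dim=1
        )
        return self.head(features)  # (batch, 6): rotations + translations

# Toy usage with random tensors standing in for real images.
net = SliceToVolumePoseNet()
trus = torch.randn(2, 1, 128, 128)      # (B, C, H, W) 2D TRUS frames
mr = torch.randn(2, 1, 64, 128, 128)    # (B, C, D, H, W) 3D MR volumes
pose = net(trus, mr)                    # predicted 6-DOF pose per sample
print(pose.shape)                       # torch.Size([2, 6])

In practice, such a network would be trained on population data and could be refined per patient (for example, fine-tuned on the patient's own MR and initial sweep frames), which is one hedged reading of how population and patient-specific information might both be exploited.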

Public Health Relevance

Fusion of magnetic resonance (MR) imaging and transrectal ultrasound (TRUS) for guiding targeted prostate biopsies can significantly improve the detection of aggressive cancer. The goal of this project is to create enabling technology for MR-TRUS image fusion based solely on internal image content, without external tracking devices. The proposed research is foundational for developing the next generation of MR-TRUS fusion guidance systems for prostate biopsy that achieve robust performance at lower cost.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Exploratory/Developmental Grants (R21)
Project #
5R21EB028001-02
Application #
9968409
Study Section
Biomedical Imaging Technology Study Section (BMIT)
Program Officer
Shabestari, Behrouz
Project Start
2019-07-01
Project End
2022-03-31
Budget Start
2020-04-01
Budget End
2021-03-31
Support Year
2
Fiscal Year
2020
Total Cost
Indirect Cost
Name
Rensselaer Polytechnic Institute
Department
Biomedical Engineering
Type
Biomed Engr/Col Engr/Engr Sta
DUNS #
002430742
City
Troy
State
NY
Country
United States
Zip Code
12180