Automatic Pelvic Organ Delineation in Prostate Cancer Treatment

Abstract
Fast, reliable, and accurate delineation of pelvic organs in planning and treatment images is a long-standing, important, and technically challenging problem. An effective solution is essential for state-of-the-art image-guided radiation therapy planning and treatment, because better treatment decisions rely on timely interpretation of the anatomical information in the images. However, automatic segmentation of the male pelvic region is particularly difficult due to 1) the low contrast between the prostate and surrounding organs, and 2) the highly variable shapes/appearances of the bladder and rectum caused by tissue deformations. The goal of this project is to create a set of novel machine learning tools for accurate, reliable, and efficient delineation of important pelvic organs (e.g., prostate, bladder, and rectum) in different modalities (e.g., planning CT, treatment CT/CBCT, and MRI) for radiotherapy of prostate cancer.

Planning CT. For automatic segmentation, landmark detection is often the first step in rapidly locating the target organs. Thus, in Aim 1, we will create a novel joint landmark detection approach, based on random forests and the auto-context model, to iteratively detect all landmarks and coordinate their individual results, yielding more accurate and consistent detections. After the organs are roughly localized with the aid of the detected landmarks, the second step is to accurately segment the boundaries of the target organs in the planning CT. Accordingly, in Aim 2, we will create a set of learning methods to a) first simultaneously predict all pelvic organ boundaries in the planning CT with regression forests trained on labeled data, and b) then segment all pelvic organs jointly by deforming their respective shape models. In particular, to address the limitation of conventional deformable models, which assume simple Gaussian distributions for organ shapes, a novel hierarchical sparse shape composition approach will be developed to constrain the shape models during deformable segmentation.

Treatment CT/CBCT. During the course of serial radiation treatments, the organs in each treatment image also need to be segmented so that the accumulated dose delivered to the patient can be quantitatively recorded and monitored. Although the methods proposed in Aims 1-2 could simply be applied to each treatment image, as many conventional methods do, this would lead to a) inconsistent landmark detection and b) inconsistent segmentations across treatment days because of possibly large shape/appearance changes. Accordingly, in Aim 3, we will create a novel self-learning mechanism that gradually learns and incorporates patient-specific information, extracted from the growing set of treatment images of the same patient, into both the joint landmark detection and deformable segmentation steps. In this way, population data will gradually be replaced by the patient's own data to train personalized models.

MRI. MRI is now often acquired for selected patients to guide pelvic organ segmentation in the planning CT. Accordingly, in Aim 4, we will develop a) a prostate MRI segmentation method that uses deep learning to learn MRI-specific features for guiding the landmark detection and deformable segmentation proposed in Aims 1-2, and b) a novel collaborative MRI and CT segmentation algorithm for more accurate segmentation of the planning CT.
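The sketch below is a minimal illustration, not the proposal's implementation, of the auto-context idea behind Aim 1, written in Python with scikit-learn: one regression forest per landmark predicts voxel-to-landmark displacements, and at each auto-context iteration the current predictions of all landmarks are appended as context features so that the individual detections are coordinated. Appearance feature extraction (e.g., Haar-like features), voxel sampling, and the final voting step are omitted; the sizes, parameters, and function names here are illustrative assumptions.

    # Minimal sketch (assumed, simplified) of joint landmark detection with
    # random forests and an auto-context loop. Appearance features are taken
    # as precomputed per-voxel vectors; the "context" is the current
    # displacement prediction of every landmark, appended at each iteration.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    N_LANDMARKS = 3   # e.g., prostate, bladder, rectum centers (illustrative)
    N_ITERS = 3       # number of auto-context iterations (illustrative)

    def train_auto_context(appearance, targets):
        """appearance: (n_samples, n_feats) per-voxel appearance features.
        targets: list of N_LANDMARKS arrays, each (n_samples, 3) giving the
        3D displacement from the voxel to the true landmark position."""
        forests = []  # forests[t][k] = forest for landmark k at iteration t
        context = np.zeros((appearance.shape[0], 3 * N_LANDMARKS))
        for t in range(N_ITERS):
            feats = np.hstack([appearance, context])  # appearance + context
            stage = [RandomForestRegressor(n_estimators=20, max_depth=10)
                     .fit(feats, targets[k]) for k in range(N_LANDMARKS)]
            forests.append(stage)
            # feed the joint predictions of this stage back as context
            context = np.hstack([rf.predict(feats) for rf in stage])
        return forests

    def detect(forests, appearance):
        """Apply the trained auto-context stages to new per-voxel features."""
        context = np.zeros((appearance.shape[0], 3 * N_LANDMARKS))
        for stage in forests:
            feats = np.hstack([appearance, context])
            context = np.hstack([rf.predict(feats) for rf in stage])
        # final coordinated displacement predictions, one 3-vector per landmark
        return context.reshape(appearance.shape[0], N_LANDMARKS, 3)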
Evaluation. All developed algorithms will be evaluated for their performance in the clinical (treatment planning and delivery) workflow on 130 patients at the UNC Cancer Hospital and at the hospitals of our consultants.

Benefit for Patient Care. Development of these segmentation tools will 1) dramatically accelerate the clinical workflow, 2) reduce the physicians' workload (i.e., manual interaction time), and 3) lead to better patient outcomes through reliable and accurate segmentation of the target area and critical organs. Although these tools cannot replace the expertise of physicians, they can be of great assistance to them.
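The shape constraint of Aim 2 can be illustrated with the following bare-bones sketch, under the assumption that the training shapes are already aligned and vectorized: an intermediate segmentation result is re-expressed as a sparse combination of training shapes, which keeps the deformable model close to plausible anatomy without assuming a Gaussian shape distribution. The hierarchical, coarse-to-fine component and the explicit sparse-error term of a full sparse shape composition are omitted; the function name and sparsity weight are hypothetical.

    # Illustrative sketch (not the proposal's algorithm) of a sparse shape
    # composition constraint: project a shape onto a sparse combination of
    # aligned training shapes using an L1-regularized (Lasso) fit.
    import numpy as np
    from sklearn.linear_model import Lasso

    def sparse_shape_constrain(shape, training_shapes, sparsity=0.01):
        """shape: (3*n_points,) current shape vector.
        training_shapes: (n_shapes, 3*n_points) aligned training shapes."""
        mean = training_shapes.mean(axis=0)
        D = (training_shapes - mean).T          # columns = centered dictionary atoms
        lasso = Lasso(alpha=sparsity, fit_intercept=False, max_iter=5000)
        lasso.fit(D, shape - mean)              # sparse coefficients over training shapes
        return mean + D @ lasso.coef_           # constrained (recomposed) shape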

Public Health Relevance

Description of Project
This project aims to create a set of novel machine learning tools for accurate, reliable, and efficient delineation of important pelvic organs (such as the prostate, bladder, and rectum) in various imaging modalities to support radiotherapy of prostate cancer. To achieve this goal, we will create 1) a novel joint landmark detection and boundary prediction framework for accurate segmentation of planning CT images, 2) an innovative updating mechanism that incorporates patient-specific information from previous treatment images of the same patient to progressively improve treatment CT/CBCT segmentation, and 3) a deep-learning-based framework for learning MRI-specific features for prostate MRI segmentation, together with a novel method for collaborative MRI and planning CT segmentation. We will validate these methods on real patients in our clinical workflow. Moreover, although these machine learning tools are designed for radiotherapy of prostate cancer, they can easily be extended to other organs and imaging modalities after appropriate retraining.
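To make the patient-specific updating idea (Aim 3) concrete, the hedged sketch below retrains an appearance model as treatment images accumulate, progressively down-weighting population samples in favor of samples harvested from the patient's own previously segmented images. The linear weighting schedule, sample formats, and function name are illustrative assumptions rather than the proposed algorithm.

    # Hedged sketch of incremental, patient-specific model updating: population
    # training samples are gradually replaced (here, re-weighted) by samples
    # from the patient's already-segmented treatment images.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def retrain_patient_specific(pop_X, pop_y, patient_X, patient_y, day):
        """pop_*: population training samples; patient_*: samples from the
        patient's earlier treatment images; day: treatment days seen so far."""
        w_patient = min(1.0, day / 10.0)   # patient weight grows over treatment days
        w_pop = 1.0 - w_patient            # population data is gradually phased out
        X = np.vstack([pop_X, patient_X])
        y = np.concatenate([pop_y, patient_y])
        w = np.concatenate([np.full(len(pop_y), w_pop),
                            np.full(len(patient_y), w_patient)])
        clf = RandomForestClassifier(n_estimators=50)
        clf.fit(X, y, sample_weight=w)     # personalized appearance/boundary model
        return clf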

Agency
National Institutes of Health (NIH)
Institute
National Cancer Institute (NCI)
Type
Research Project (R01)
Project #
5R01CA206100-03
Application #
9527787
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Redmond, George O
Project Start
2016-09-01
Project End
2021-07-31
Budget Start
2018-08-01
Budget End
2019-07-31
Support Year
3
Fiscal Year
2018
Total Cost
Indirect Cost
Name
University of North Carolina Chapel Hill
Department
Radiation-Diagnostic/Oncology
Type
Schools of Medicine
DUNS #
608195277
City
Chapel Hill
State
NC
Country
United States
Zip Code
27599
Feng, Zishun; Nie, Dong; Wang, Li et al. (2018) Semi-Supervised Learning for Pelvic MR Image Segmentation Based on Multi-Task Residual Fully Convolutional Networks. Proc IEEE Int Symp Biomed Imaging 2018:885-888
Zhang, Yongqin; Shi, Feng; Cheng, Jian et al. (2018) Longitudinally Guided Super-Resolution of Neonatal Brain Magnetic Resonance Images. IEEE Trans Cybern
Cao, Xiaohuan; Yang, Jianhua; Gao, Yaozong et al. (2018) Region-adaptive Deformable Registration of CT/MRI Pelvic Images via Learning-based Image Synthesis. IEEE Trans Image Process
Wang, Zhensong; Wei, Lifang; Wang, Li et al. (2018) Hierarchical Vertex Regression-Based Segmentation of Head and Neck CT Images for Radiotherapy Planning. IEEE Trans Image Process 27:923-937
Ren, Xuhua; Xiang, Lei; Nie, Dong et al. (2018) Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images. Med Phys 45:2063-2075
Cao, Xiaohuan; Yang, Jianhua; Zhang, Jun et al. (2018) Deformable Image Registration Using a Cue-Aware Deep Regression Network. IEEE Trans Biomed Eng 65:1900-1911
Trullo, Roger; Petitjean, Caroline; Nie, Dong et al. (2017) Joint Segmentation of Multiple Thoracic Organs in CT Images with Two Collaborative Deep Architectures. Deep Learn Med Image Anal Multimodal Learn Clin Decis Support 10553:21-29
Trullo, Roger; Petitjean, Caroline; Nie, Dong et al. (2017) Fully automated esophagus segmentation with a hierarchical deep learning approach. Conf Proc IEEE Int Conf Signal Image Process Appl 2017:503-506
Trullo, R; Petitjean, C; Ruan, S et al. (2017) Segmentation of Organs at Risk in Thoracic CT Images Using a SharpMask Architecture and Conditional Random Fields. Proc IEEE Int Symp Biomed Imaging 2017:1003-1006
Yin, Qingbo; Hung, Sheng-Che; Wang, Li et al. (2017) Associations between Tumor Vascularity, Vascular Endothelial Growth Factor Expression and PET/MRI Radiomic Signatures in Primary Clear-Cell-Renal-Cell-Carcinoma: Proof-of-Concept Study. Sci Rep 7:43356

Showing the most recent 10 out of 22 publications