Screening mammography saves lives, but human interpretation alone is imperfect and is associated with significant harms, including approximately 30,000 missed breast cancers and 3.8 million false-positive exams each year in the U.S. alone. Traditional computer-aided detection failed to improve screening accuracy, in part because of the static nature of software trained and tested on small datasets decades ago. Recent advances in computer processing power, cloud-based data storage, and the availability of large imaging datasets have renewed enthusiasm for applying artificial intelligence (AI) to mammography interpretation.

We propose a unique academic-industry partnership to validate, refine, scale, and clinically translate our proven 2D mammography AI algorithm to 3D mammography interpretation. Our team helped organize and lead the Dialogue on Reverse Engineering Assessment and Methods (DREAM) Digital Mammography Challenge, an open, crowdsourced AI algorithmic challenge that provided >640,000 digital 2D mammogram images and associated clinical metadata to >1,200 coding teams worldwide. Our industry partner for this grant, DeepHealth, Inc., was the top-performing team in the DREAM Challenge. With >50% of U.S. facilities now offering 3D mammography for screening, the 50-to-100-fold increase in imaging data represents a new critical barrier for both radiologists and AI algorithm developers. To date, few publications have addressed AI-enhanced interpretation of 3D mammography, the emerging screening exam of choice.

We will validate our post-DREAM algorithm for 2D mammography (which currently rivals human interpretation alone) using UCLA's Athena Breast Health Network, one of the largest population-based breast imaging registries. We will enhance our 2D AI algorithm with expert radiologist supervision and examine the impact of adding novel non-imaging data parameters, including genetic mutation and tumor molecular subtype data, to train the AI model to identify more clinically significant cancers. We will apply several novel technical approaches to scale the algorithm from 2D to 3D mammography, which in our preliminary studies have improved accuracy beyond radiologist interpretation alone. Finally, we will perform a series of interpretive studies to identify the optimal interface between "black box" outputs and radiologist interpreters, which remains an understudied topic.

With >40 million U.S. women undergoing screening each year, even seemingly small improvements in overall accuracy would translate into substantially improved population-based outcomes. In summary, we have assembled an unparalleled multidisciplinary team with expertise in machine/deep learning, breast cancer screening accuracy, medicine, oncology, radiology, imaging technology assessment, and biostatistics. We have a proven track record of strong collaboration and are well positioned to validate, enhance, scale, and translate our proven 2D AI algorithm for improved 3D mammography accuracy. Our new end-user tool will help tip the balance of routine screening toward greater benefits than harms.
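To make the scale of these numbers concrete, the following is a minimal back-of-the-envelope sketch. The screening volume (~40 million exams/year) and false-positive count (~3.8 million/year) are taken from the abstract above; the one-percentage-point reduction in the false-positive rate is a purely hypothetical assumption for illustration, not a result claimed by the project.

```python
# Illustrative arithmetic only: population-scale effect of a small accuracy change.
ANNUAL_SCREENS = 40_000_000          # U.S. screening mammograms per year (from abstract)
ANNUAL_FALSE_POSITIVES = 3_800_000   # false-positive exams per year (from abstract)

baseline_fp_rate = ANNUAL_FALSE_POSITIVES / ANNUAL_SCREENS   # ~0.095, i.e. ~9.5%

# Hypothetical assumption: an AI-assisted workflow lowers the false-positive
# rate by one absolute percentage point.
improved_fp_rate = baseline_fp_rate - 0.01
fp_exams_avoided = (baseline_fp_rate - improved_fp_rate) * ANNUAL_SCREENS

print(f"Implied baseline false-positive rate: {baseline_fp_rate:.1%}")
print(f"False-positive exams avoided per year: {fp_exams_avoided:,.0f}")  # ~400,000
```

Under these assumptions, even a one-point improvement would avoid on the order of 400,000 false-positive exams per year, which is the sense in which "seemingly small" accuracy gains translate into population-level benefit.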

Public Health Relevance

Current interpretation of screening mammography is limited by human performance, leading to approximately 30,000 missed cancers and 3.8 million false-positive exams every year in the U.S. alone. Our multidisciplinary team of experts in breast cancer screening, machine and deep learning, data science, and imaging technology assessment will validate our highly accurate 2D mammography artificial intelligence (AI) algorithm, which was the best performer in an international competition with >1,200 participating teams, and then further enhance it with novel AI augmentation methods. We will then apply innovative techniques to scale the AI algorithm from 2D to 3D mammography, addressing the 50-to-100-fold increase in volumetric data that 3D exams entail, and finally translate our optimized 3D mammography AI tool into clinical practice through a series of interpretive accuracy studies involving experienced and inexperienced radiologists from both academic and community practices.

Agency
National Institutes of Health (NIH)
Institute
National Cancer Institute (NCI)
Type
Research Project (R01)
Project #
1R01CA240403-01A1
Application #
9912472
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
Hartshorn, Christopher
Project Start
2020-01-01
Project End
2024-12-31
Budget Start
2020-01-01
Budget End
2020-12-31
Support Year
1
Fiscal Year
2020
Total Cost
Indirect Cost
Name
University of Washington
Department
Radiation-Diagnostic/Oncology
Type
Schools of Medicine
DUNS #
605799469
City
Seattle
State
WA
Country
United States
Zip Code
98195