Fluoroscopy guidance using C-arm X-ray systems is used in more than 17 million procedures across the US and constitutes the standard of care for various percutaneous procedures, including internal fixation of pelvic ring injuries. To infer procedural progress from 2D radiographs, well-defined views onto anatomy must be achieved and restored multiple times during surgery. This process, known as "fluoro hunting", is associated with 4.7 s of excess fluoroscopy time per C-arm position (cf. 120 s total per fixation), yielding radiographs that are never interpreted clinically but drastically increasing procedure time and radiation dose to patient and surgical staff. Our long-term project goal is to use concepts from machine learning and active vision to develop task-aware algorithms for autonomous robotic C-arm servoing that interpret intra-operative radiographs and autonomously adjust the C-arm pose to acquire fluoroscopic images that are optimal for inference. We have three specific aims:

1) Detecting unfavorable K-wire trajectories from monoplane fluoroscopy images: We will extend a physics-based simulation framework for fluoroscopy from CT that enables fast generation of structured and realistic radiographs documenting procedural progress (a minimal simulation sketch follows the abstract). Based on these data, we will train a state-of-the-art convolutional neural network that interprets fluoroscopic images to infer procedural progress.

2) Developing and validating a task-aware imaging system in silico: Using the autonomous interpretation tools and simulation pipeline from Aim 1, we will train an artificial agent based on reinforcement learning and active vision (see the servoing loop sketch below). This agent will analyze intra-operative fluoroscopic images and autonomously adjust the C-arm pose to yield task-optimal views onto anatomy.

3) Demonstrating feasibility of our task-aware imaging concept ex vivo: Our third aim will establish task-aware C-arm imaging in controlled clinical environments. We will attempt internal fixation of anterior pelvic ring fractures, and our task-aware artificial agent will interpret intra-operatively acquired radiographs to infer procedural progress and suggest optimal C-arm poses, which will be realized manually with an optically-tracked mobile C-arm system.

This work combines the expertise of a computer scientist, a surgical robotics expert, and an orthopedic trauma surgeon to explore the untapped, understudied area of autonomous imaging in fluoroscopy-guided procedures, enabled by recent advances in machine learning. This development has only recently become feasible through innovations in fast fluoroscopy simulation from CT that provide structured training data sufficiently realistic to warrant generalization to clinical data. With support from the NIH Trailblazer Award, our team will be the first to investigate autonomous, task-aware C-arm imaging systems, paving the way for a new paradigm in medical image acquisition that will directly benefit millions of patients through task-oriented image acquisition on a patient-specific basis. Subsequent R01 funding will adapt this concept to other high-volume procedures, such as vertebroplasty.
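For concreteness, the core of Aim 1's simulation step, rendering a digitally reconstructed radiograph (DRR) by ray casting through a CT-derived attenuation volume, can be sketched as below. This is a minimal, illustrative sketch, not the project's actual framework: all names (simulate_drr, mu_volume, det_pitch, ...) are hypothetical, a monoenergetic beam is assumed, and a realistic physics-based pipeline would additionally model the polychromatic X-ray spectrum, scatter, and noise, and run on the GPU for speed.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def simulate_drr(mu_volume, voxel_size, source, detector_center,
                 u_axis, v_axis, det_shape=(96, 96), det_pitch=2.0,
                 n_samples=256):
    """Toy DRR renderer: cast rays from the X-ray source through a CT-derived
    attenuation volume onto a virtual detector and apply the Beer-Lambert law.

    Assumptions (illustrative only):
      mu_volume       3D array of linear attenuation coefficients in 1/mm,
                      e.g. mu_water * (1 + HU / 1000) from a CT in Hounsfield
                      units, indexed (x, y, z) with isotropic voxel_size (mm)
      source          X-ray source position in mm, shape (3,)
      detector_center detector center in mm, shape (3,)
      u_axis, v_axis  orthonormal in-plane detector axes, shape (3,)
    """
    nv, nu = det_shape
    us = (np.arange(nu) - nu / 2.0) * det_pitch
    vs = (np.arange(nv) - nv / 2.0) * det_pitch
    ts = np.linspace(0.0, 1.0, n_samples)
    drr = np.zeros((nv, nu))
    for j, v in enumerate(vs):
        for i, u in enumerate(us):
            pixel = detector_center + u * u_axis + v * v_axis
            ray = pixel - source
            # Sample attenuation along the source-to-pixel ray (trilinear interp).
            pts = source[None, :] + ts[:, None] * ray[None, :]
            mu = map_coordinates(mu_volume, (pts / voxel_size).T,
                                 order=1, mode='constant', cval=0.0)
            step = np.linalg.norm(ray) / (n_samples - 1)  # mm per sample
            drr[j, i] = np.exp(-mu.sum() * step)          # transmitted intensity
    return drr
```

Rendered across many CT volumes, K-wire placements, and C-arm poses, images of this kind would supply the structured, labeled training data for Aim 1's convolutional network.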
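Aim 2's servoing agent can be pictured as the interaction loop below. This is a generic active-vision skeleton under assumed interfaces, not the project's actual design: env.acquire(pose) returning a radiograph (a simulated DRR in silico, a tracked C-arm acquisition ex vivo) and a learned policy mapping images to discrete pose adjustments are both hypothetical, as are the action set and step sizes.

```python
import numpy as np

# Hypothetical discrete action set: small adjustments of the C-arm's orbital
# and angular rotation, in degrees; (0, 0) means "stop, current view is optimal".
ACTIONS = [(+5, 0), (-5, 0), (0, +5), (0, -5), (0, 0)]

def servoing_episode(env, policy, max_steps=20):
    """Active-vision servoing loop: observe a radiograph, let the agent choose
    a pose adjustment, re-acquire, and repeat until the agent judges the view
    task-optimal. `env` and `policy` are assumed interfaces for illustration."""
    pose = np.zeros(2)                       # (orbital, angular) in degrees
    image = env.acquire(pose)
    trajectory = [(pose.copy(), image)]
    for _ in range(max_steps):
        action = policy(image)               # e.g., argmax of a CNN's Q-values
        d_orbital, d_angular = ACTIONS[action]
        if (d_orbital, d_angular) == (0, 0):
            break                            # agent deems the current view optimal
        pose += np.array([d_orbital, d_angular], dtype=float)
        image = env.acquire(pose)
        trajectory.append((pose.copy(), image))
    return trajectory
```

During training, a reward scoring the task-specific quality of each newly acquired view (e.g., how well it supports K-wire trajectory assessment) would drive a standard value-based or policy-gradient update.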

Public Health Relevance

Fluoroscopy guidance using C-arm X-ray systems is the standard of care for percutaneous fracture fixation and requires surgeons to achieve and reproduce well-defined views onto anatomy to infer procedural progress. This requirement alone is estimated to contribute 4.7 s of fluoroscopy time per C-arm repositioning (cf. 120 s total per fixation), drastically increasing procedure time and radiation dose to patient and surgical team. The goal of this project is to develop machine learning-based C-arm servoing algorithms that introduce task awareness by interpreting intra-operative radiographs and autonomously adjusting the C-arm pose to achieve task-optimal views.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Biomedical Imaging and Bioengineering (NIBIB)
Type
Exploratory/Developmental Grants (R21)
Project #
5R21EB028505-02
Application #
10143238
Study Section
Imaging Guided Interventions and Surgery Study Section (IGIS)
Program Officer
Shabestari, Behrouz
Project Start
2020-04-15
Project End
2022-12-31
Budget Start
2021-01-01
Budget End
2021-12-31
Support Year
2
Fiscal Year
2021
Total Cost
Indirect Cost
Name
Johns Hopkins University
Department
Biostatistics & Other Math Sci
Type
Biomed Engr/Col Engr/Engr Sta
DUNS #
001910777
City
Baltimore
State
MD
Country
United States
Zip Code
21218