AI Use Cases: What Radiologists Need to Know
While computer science experts understand how to train computers, it falls to radiologists to help AI developers understand which problems need to be solved to improve patient care.
Working to solve problems that don’t exist is a waste of everyone’s time. Preventing that fruitless pursuit is a key goal of the ACR Data Science Institute™ (DSI) Data Science Subspecialty Panels.
Use cases are the mechanism by which DSI communicates to the AI developer community the tasks AI could perform that would be useful to radiologists and could improve patient care. More specifically, a use case is a narrative description and flow chart defining the goal of an AI algorithm — including required clinical inputs — and describing how the algorithm would best integrate into radiologists’ workflows alongside other radiology tools.
Why Develop Use Cases?
While computer science experts understand how to train computers to process images, they often do not understand which parts of a radiologic study are most relevant to patient care. As radiologists, we take this knowledge for granted. For someone with no medical background, however, it is next to impossible to determine what is clinically relevant without some guidance.
Clinical relevancy may involve identification of specific imaging findings, accurate quantification of imaging biomarkers, classification of pathological conditions, or a change in appearance from an earlier study to a later study. Even when algorithms are capable of identifying a clinically relevant finding or pathological condition, it is important to ask: Is this a problem worth solving?
While an algorithm could be trained to count the fingers on a hand radiograph, doing so would offer no benefit to the radiologist reading the study. A better alternative would be a narrow AI system that could quietly evaluate every visible scaphoid bone on a hand or wrist radiograph, then alert the radiologist when it detects a fracture.
An algorithm that supports detection and classification of certain fractures would also be a welcome addition to most radiologists’ armamentarium, provided it integrates with existing systems and does not hinder workflow. To be accepted by the market and provide value-added patient care, an algorithm will need to be clinically useful and readily integrated into radiologists’ standard workflow. And that’s where the ACR DSI Data Science Subspecialty Panels come in.
How Are DSI Use Cases Novel?
ACR formed the DSI in 2017 with the goal of creating a framework for implementation of machine learning in the radiological professions. From the start, DSI set out to define clinically relevant use cases for the development of AI algorithms in medical imaging, IR, and radiation oncology. The Data Science Subspecialty Panels were formed early on to begin considering a broad range of possible use cases for AI and make recommendations on their potential impact.
The panels are composed of radiologists with diverse backgrounds, including both academic and private practice radiologists. Panel members collaborate to identify relevant AI use cases and prioritize them for use by developers building AI algorithms. Some panels include radiologists who spend a significant portion of their time in industry. These panel members understand the perspective of a radiology software and services vendor and can provide input as to which projects are feasible and which are unrealistic at this point in AI development.
How Do the Use Case Panels Work?
The dozens of volunteer experts recruited by DSI in 2017 to begin developing use cases were tasked with creating a usable framework for AI algorithm developers, then building on that framework to create clinically relevant use cases. Panel volunteers included physicians, medical physicists, data scientists, and software engineers, among others.
Once the panels were assembled, each radiologist was asked to submit two or three problems from their practices that might be amenable to an AI solution. Panelists then made “elevator pitches” for their concepts during panel meetings and other panel members provided comments.
At this early stage, ideas for potential use cases were discarded only if the panel felt they fell outside the scope of the specific panel or did not require input from radiologists. The vast majority of proposed use cases moved forward to the next round of review. Details of each concept were then entered into templates provided by the DSI.
During panel conference calls over the next few months, individuals presented their draft concepts and received feedback on various aspects of each proposed use case. After the calls, DSI staff incorporated the changes into the drafts and sent them back to the presenting panel member for editing.
What’s on the Horizon for Use Cases?
To achieve success, panels will rely on a broad array of stakeholders — including individual ACR members, academic departments, other radiology societies, and the developer community itself — to submit fresh ideas for use cases. This will help keep the AI engine running and create a best-in-class directory of hundreds of AI use cases.
By Jay W. Patti, MD, chair of an ACR Data Science Institute™ use case panel and chief radiology informatics officer at Mecklenburg Radiology Associates in Charlotte, N.C.