AI Hits Real-World Radiology
Algorithms are expected to help radiologists do their jobs, but IT execs must figure out how best to deploy them.
As providers begin to see the potential of artificial intelligence and machine learning, IT must find cost-effective ways to put the technology to work.
Algorithms based on machine learning and deep learning, intended for use in diagnostic imaging, are moving into the commercial pipeline.
However, providers will have to overcome multiple challenges to incorporate these tools into daily clinical workflows in radiology.
Numerous algorithms are now in various stages of development or FDA review, and experts believe there could eventually be hundreds or even thousands of AI-based apps to improve the quality and efficiency of radiology.
The emerging applications based on machine learning and deep learning primarily involve algorithms that automate radiology tasks such as detecting abnormal structures in images, including cancerous lesions and nodules. The technology can be applied to a variety of modalities, including CT scans and X-rays. The goal is to help radiologists detect and track the progression of disease more effectively, giving them tools that enhance speed and accuracy, thus improving quality and reducing costs.
While the number of organizations incorporating these products into daily workflows is small today, experts expect many providers to adopt these solutions as the industry overcomes implementation challenges.
For example, in a report released in August, UK-based research firm Signify Research predicts that the worldwide market for machine learning software in medical imaging to “automate detection, quantification, decision support and diagnosis” in radiology will exceed $2 billion by 2023.
Signify says the growth of these types of tools will be fueled by the prevalence of cloud-based computing and storage solutions as well as the introduction of deep learning to analyze digital images.
In addition to technical factors, radiologists’ acceptance of AI also will fuel growth, according to Signify. “The interest and enthusiasm for AI in the radiologist community has notably increased over the past 12 to 18 months, and the discussion has moved on from AI as a threat to how AI will augment radiologists,” says Simon Harris, a principal at Signify.
Radiologists’ growing appreciation for AI may result from the technology’s promise to help the profession cope with an explosion in the amount of data for each patient case.
“As our systems improve, we start to acquire more and more images,” says Matt Dewey, CIO of Wake Radiology, which operates outpatient imaging centers in the Raleigh-Durham, N.C., area. One example is mammography, which is evolving from 2D to 3D imaging, he says. “We go from a study that used to be 64 megabytes for a normal, standard study to about 2 gigs, so it just takes the radiologist much more time to go through. If we can find a way that a computer looks through it, it should make a difference. It can highlight things for the radiologist.”
Radiologists also are grappling with the growth in data from sources outside radiology, such as lab tests and electronic medical records. This is another area where AI could help radiologists by analyzing data from disparate sources and pulling out key pieces of information for each case, Dewey says.
There are other issues AI could address as well, such as “observer fatigue,” which is an “aspect of radiology practice and a particular issue in screening examinations where the likelihood of finding a true positive is low,” wrote researchers from Massachusetts General Hospital and Harvard Medical School in a 2018 article in the Journal of the American College of Radiology. These researchers foresee the utility of an AI program that could identify cases from routine screening exams with a likely positive result and prioritize those cases for radiologists’ attention.
AI software also could help radiologists improve worklists of cases in which referring physicians already suspect that a medical problem exists.
Radiologists learn of the potential seriousness of a given imaging study when a referring clinician labels it as STAT, explains Luciano Prevedello, MD, division chief in medical imaging informatics at Ohio State University Wexner Medical Center. Prevedello says this is not an ideal system for prioritizing workflow in radiology for two primary reasons. Sometimes, images show critical findings the ordering physicians did not anticipate, and even within the studies labeled STAT—about 40 percent of all studies—there are varying degrees of urgency.
Despite the promise of machine learning and deep learning in radiology, moving such algorithms from the realm of scientific discovery to daily clinical workflows involves overcoming practical and financial challenges.
Developing in-house solutions
Convinced of the potential of AI solutions for radiology, Ohio State currently is focusing on developing algorithms through in-house research.
Prevedello says the eight-member radiology-informatics team at Ohio State has developed an algorithm that prioritizes computed tomography images of the head based on whether there are critical findings. Researchers have configured the algorithm so it returns results in six seconds by processing CT images on a separate server and then sends a message about whether there is a critical finding for an imaging study to the worklist software within the PACS.
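In essence, a worklist re-prioritization step of this kind moves flagged studies to the front of the reading queue. A minimal sketch in Python (the function and study names are hypothetical; Ohio State's actual integration is not public):

```python
def prioritize(worklist, critical_flags):
    """Re-order a reading worklist using a model's critical-finding flags.

    worklist is a list of accession numbers in their current order;
    critical_flags maps accession number -> True if the algorithm
    detected a critical finding on that head CT.  The sort is stable,
    so non-critical studies keep their original relative order.
    """
    return sorted(worklist, key=lambda acc: 0 if critical_flags.get(acc) else 1)

# Example: the model flagged study "CT-002" as critical.
order = prioritize(["CT-001", "CT-002", "CT-003"], {"CT-002": True})
# "CT-002" jumps to the front; the others keep their order.
```

A production version would instead push a priority message to the PACS worklist software, but the ranking logic is the same.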
The next step is to set up a clinical trial. “This is an important step to see if what we developed in the lab can be expanded to a clinical setting,” Prevedello says. “If it is successful there, we will implement it in the clinical setting.” Ohio State has 65 attending physicians in its radiology department.
To build the tool, researchers trained an algorithm using a set of 2,583 head images and validated the tool with a second data set of 100 head images.
The team also is working to develop similar solutions for images of other body parts. “We hope to have a very comprehensive solution at some point,” he says.
Evaluating commercial options
Other providers are venturing into the commercial sector to evaluate potential solutions.
A challenge to overcome with this approach is the fragmented nature of the emerging marketplace for these algorithms. “I see these small startup AI companies that solve one or two problems. They do very narrow AI, which is a very reasonable approach,” says Don Dennison, president of Don K. Dennison Solutions, an imaging and informatics consulting firm. For example, they may focus on just one disease, body part or imaging modality.
“The problem is that now there are many, many, many of those, or tens of thousands of those combinations to be evaluated,” Dennison suggests.
That’s one of the problems executives at Wake Radiology—which also reads studies for UNC REX Healthcare’s inpatient and outpatient radiology departments—are trying to solve as they work to bring these applications into daily clinical use.
They do not want to go through the usual software purchase and implementation process every time they want to add a new algorithm. “Effectively, what I needed was something like an app store, where I can choose this algorithm or that algorithm,” Dewey says. “I don’t want to talk to every company that has an algorithm. I want to just choose this app and know the platform is going to work.”
This is one of the reasons Wake became a beta site for a platform from EnvoyAI, which is designed to give physicians access to multiple algorithms through a single solution.
Wake Radiology is live with one algorithm, which determines bone age in pediatric patients. Senior management chose to start with this algorithm because measuring bone age is “a pretty standard calculation,” without a lot of nuance, Dewey says.
To set up the process, Wake Radiology routes the pertinent imaging studies from its vendor-neutral archive to EnvoyAI’s on-premises platform, which anonymizes the data and routes it to the algorithm in the cloud. EnvoyAI then routes the results back to Wake’s PACS.
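The anonymize-before-routing step can be illustrated with a minimal sketch. The tag names below are standard DICOM patient attributes, but the function and routing logic are hypothetical, not EnvoyAI's actual implementation:

```python
# A minimal subset of the identifying attributes that the DICOM
# de-identification profile (PS3.15) says must be removed or replaced
# before a study leaves the premises.
PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate", "PatientAddress"}

def anonymize(header):
    """Return a copy of a DICOM-style header with identifying tags blanked,
    leaving clinically relevant tags intact for the cloud algorithm."""
    return {tag: ("" if tag in PHI_TAGS else value)
            for tag, value in header.items()}

study = {"PatientName": "DOE^JANE", "PatientID": "12345",
         "Modality": "CR", "StudyDescription": "Pediatric bone age"}
clean = anonymize(study)
# Identifiers are blanked; Modality and StudyDescription survive.
```

A real deployment would operate on actual DICOM objects (for example, with a library such as pydicom) and follow the full de-identification profile, not this four-tag subset.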
Ohio State’s Prevedello believes this type of platform solution has potential if AI developers are willing to customize their apps for multiple platforms, as is the case today with Google Play and Apple’s App Store.
In addition to EnvoyAI, Nuance Communications has said it is working on such a platform, Nuance AI Marketplace for Diagnostic Imaging. Blackford Analysis also has announced a platform to integrate multiple algorithms and make them available in either a PACS or image viewer. The platform includes its own product, Smart Localizer, which automatically registers images, enabling radiologists to compare images taken at different times, on equipment from different vendors and in different modalities.
As Dennison explains, “The value proposition of any app store is the breadth and value of the apps it has and how easy it is to implement in production. The race is on to get the most, highest value apps installed in production in the most places.”
To scale these solutions, vendors are announcing partnerships. For example, Nuance has announced partners for its platform, such as NVIDIA, which has a deep learning platform, and Partners HealthCare. Blackford has announced a partnership with Intelerad Medical Systems, a vendor of PACS, viewers and related products.
Creating open systems
Even if these platforms build sizable customer bases, it’s still important to develop standard protocols for communicating the types of information these algorithms are likely to take in and report out, experts say.
The need for communications standards will become an even bigger issue in the future as the AI-based algorithms become increasingly robust, analyzing and processing data from many sources—such as images and data from EHRs or wearable devices. Even more challenging will be the process of managing outputs from AI algorithms located throughout providers’ enterprises, including many in different clinical specialties and administrative areas.
Without an open process, switching between platform vendors, replacing outdated algorithms or managing algorithms across multiple departments would be time consuming and expensive for providers, experts say.
Dennison says DICOM and HL7 API committees are already thinking about how to adapt existing standards to incorporate outputs from AI algorithms. He says Integrating the Healthcare Enterprise (IHE) International also plans to develop an integration profile for entities that create, store, manage and display output from AI algorithms.
In addition to IT architecture challenges, there are people issues, such as confronting radiologists’ skepticism about AI’s merits. Citing the example of mammography, Dewey notes that computer-aided detection does not appeal to all radiologists because they worry that the applications used for this purpose may misinterpret information in the images.
In the case of the AI-based algorithms, however, radiologists have the option to feed information back into the algorithms when they disagree with the results—this would enable the models to learn continually and become increasingly accurate.
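Conceptually, that feedback loop amounts to logging each disagreement as a newly labeled example for the next training cycle. A minimal sketch (all names hypothetical):

```python
# Cases where the radiologist overrode the algorithm, to be added
# to the training set at the next retraining cycle.
feedback_log = []

def record_feedback(accession, model_output, radiologist_finding):
    """Log a study only when the radiologist disagrees with the model,
    keeping the radiologist's finding as the corrected label."""
    if model_output != radiologist_finding:
        feedback_log.append({"accession": accession,
                             "label": radiologist_finding})

record_feedback("MR-100", model_output="normal", radiologist_finding="abnormal")
record_feedback("MR-101", model_output="normal", radiologist_finding="normal")
# Only the disagreement (MR-100) is logged for retraining.
```

In practice, retraining a cleared medical algorithm on such feedback also raises regulatory questions that this sketch ignores.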
Radiologists at Wake also want to accept or reject the output from an algorithm on a specific imaging study before the data flows into the PACS. As a result, Wake plans to revise the current workflow, which it is using for the bone age calculation, so results from other algorithms are not incorporated directly into the PACS but instead will go to a separate viewer, Northstar AI Explorer, from TeraRecon.
For example, Dewey says, Wake Radiology plans to implement an algorithm, NeuroQuant, from CorTechs Labs, which measures the volumes of certain brain structures on MRI images to help detect dementia and neurodegenerative diseases. However, the algorithm does not currently consider data from outside sources, such as electronic medical records, that might be pertinent.
For instance, Dewey says, if a patient had surgery in which part of his or her brain was removed, that information must be considered before arriving at a diagnosis of dementia. That’s why it’s important for radiologists to review the outputs of the algorithm.
Wake Radiology also plans to use this approach to incorporate two other algorithms—one for CT lung images and a second for chest X-rays. It also intends to implement a feedback functionality to enable radiologists to help train the algorithm.
A subsequent phase of the project would integrate the viewer directly inside the PACS and integrate the outputs from the algorithms with automated dictation software, so radiologists won’t have to verbally communicate findings from the algorithm for their reports.
Another practical issue is hiring IT staff with the skills to troubleshoot problems or optimize a platform or application, particularly as AI becomes more prevalent.
Searching for a model
But perhaps the biggest hurdle will involve developing a feasible economic model that rewards algorithm developers for their work and providers for using the technology.
Hoping to improve the cost-benefit calculation among providers mulling whether to purchase a commercial product, Zebra Medical Vision last year announced that it will charge $1 per scan for access to its current and future cloud-based algorithms. Zebra has developed a deep learning tool that detects liver, cardiovascular, lung and bone diseases in imaging studies.
Ben Panter, CEO of Blackford Analysis, argues that the platform approach takes costs out of the system for both buyers and sellers. “Economical ways of deploying AI in healthcare are essential,” he says. “We need to develop more efficient ways than individual companies going to individual hospitals trying to make a sale. That drives the costs up so high that we are never going to solve any of the problems in healthcare.”
However, these approaches do not directly address the underlying tension of applying these AI models to the current process of reading and reporting on imaging studies in radiology.
Dennison notes that these algorithms do not replace radiologists but help them—a role he describes as a “virtual resident.” In this scenario, an outpatient imaging center or hospital has taken on a new layer of costs, but it hasn’t reduced the amount it pays the radiologist or increased the amount it gets reimbursed by a payer.
This won’t work financially unless the new software aids radiologists’ efficiency enough so they can review more images and produce more reports.
That is why providers are not likely to fully realize the value of their investments in AI-powered algorithms until they use those tools to replace, rather than assist, humans in some steps of the workflow, says Robert Fuller, managing partner for healthcare at Clarity Insights, a consulting firm specializing in data analytics.
Take the example of chest X-rays that physicians order to screen patients for lung cancer, he says. After an algorithm determines that a given set of X-rays shows a negative finding, those results could move directly to the reporting phase of the process without review by a radiologist, while X-rays with abnormal findings would then be forwarded to radiologists to review.
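The triage Fuller describes reduces, in essence, to a confidence-threshold gate on the model's output. A hedged sketch (the threshold and all names are illustrative, and in practice auto-reporting negatives would require regulatory clearance):

```python
def triage(study_id, abnormality_score, threshold=0.05):
    """Route a screening chest X-ray based on the model's abnormality score.

    Scores below the threshold are treated as confident negatives and
    sent straight to reporting; everything else is queued for a
    radiologist to review.
    """
    if abnormality_score < threshold:
        return (study_id, "auto_report")        # negative: straight to reporting
    return (study_id, "radiologist_review")     # possible finding: human review

# A clearly negative study skips the radiologist; a suspicious one does not.
negative = triage("CXR-1", 0.01)
suspicious = triage("CXR-2", 0.60)
```

The threshold is where the buy-in Fuller mentions becomes concrete: it would have to be set, and defended, from measured false-negative rates.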
“The only way you are going to drive out cost is by believing in the solution you put in place—proving it out; feeling comfortable with the accuracy—and reducing the workload,” he says. “It is more about getting people’s buy-in; this has to be a process where people accept the accuracy of the solution. That won’t happen overnight.” ☐