By Beth W. Orenstein
Vol. 22 No. 6 P. 20
Analytics can help expand AI’s role in radiology.
It seems as though radiologists have had a love-hate relationship with AI. On one hand, they can point to instances where AI has helped tremendously. For example, in a study recently published in Digital Medicine, a research team at New York University’s (NYU) Grossman School of Medicine used analytics data from 3,661 patients to predict, with up to 80% accuracy, which COVID-19 patients would develop life-threatening complications within four days, so treatment resources could be best matched to those at increased risk.
“In our reader study, we compared our AI system to two radiologists with varying years of experience,” says Farah Shamout, DPhil, an assistant professor/emerging scholar of electrical and computer engineering in the engineering division of NYU Abu Dhabi. While the researchers couldn’t reach conclusive results with such a small pool of readers, Shamout says, “we hypothesized that our system outperforms radiologists in this task due to the then-limited clinical understanding of which pulmonary parenchymal patterns predict clinical deterioration, rather than the severity of lung involvement, which is what radiologists are used to interpreting.”
However, Raym Geis, MD, FSIIM, FACR, an adjunct associate professor of radiology at National Jewish Health in Colorado and a senior scientist at the ACR Data Science Institute, says he’s found that AI doesn’t perform as well “in controlled research environments or [as] start-up companies would have us believe.” AI is not worthless, he says, but it’s not the “panacea everyone thought it might be years ago, at least not in terms of looking at images and saying, ‘Oh, I can read this exam. I can find the cancer. I can tell you what kind of disease you have.’”
Radiology’s rocky relationship with AI may be due to the way it evolved. In the mid-1980s, radiologists began to explore computer-aided detection (CAD), an early version of AI, as a possible aid for reading mammograms. Despite the early hype, however, radiologists found that because of the relatively high rate of false-positive exams, “AI” mammography was of limited value, says Eliot Siegel, MD, a professor and vice chair in the department of diagnostic radiology at the University of Maryland School of Medicine and chief of radiology and nuclear medicine for the VA Maryland Healthcare System.
“Eighty-nine percent of mammographers used CAD primarily because it created additional reimbursement, but very few actually relied on it,” Siegel says. To avoid missing lesions, “most early implementations of AI also had high false-positive rates. Also, the sensitivity/specificity curve was not adjustable on those early CAD implementations and, for the most part, is not on current AI applications,” he adds.
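The adjustable operating point Siegel describes can be illustrated with a toy sketch (the scores and labels below are made up for illustration, not from the article): sliding a detection algorithm’s decision threshold trades sensitivity against specificity, which is exactly the dial early CAD systems did not expose.

```python
# Illustrative sketch: how an adjustable decision threshold trades
# sensitivity (catching true lesions) against specificity (avoiding
# false alarms). All numbers here are made-up toy data.

def sensitivity_specificity(scores, labels, threshold):
    """Return (sensitivity, specificity) when flagging scores >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Toy algorithm outputs: suspicion scores with ground truth (1 = lesion).
scores = [0.95, 0.85, 0.70, 0.60, 0.40, 0.35, 0.20, 0.10]
labels = [1,    1,    0,    1,    0,    1,    0,    0]

# A low threshold misses little but raises many false alarms;
# raising it cuts false positives at the cost of sensitivity.
for t in (0.3, 0.65):
    sens, spec = sensitivity_specificity(scores, labels, t)
    print(f"threshold={t:.2f}  sensitivity={sens:.2f}  specificity={spec:.2f}")
```

On this toy data, the low threshold catches every lesion but flags half the normal cases; the higher threshold halves the false alarms while missing half the lesions. An exam-reading tool that lets users tune this dial can be matched to the clinical tolerance for misses versus false positives.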
Another initial issue with AI, Siegel notes, is that AI algorithms were geared more toward what was easily available in public databases, such as lung nodule detection or pneumonia detection or what was relatively easy to turn into a deep learning algorithm, such as pediatric bone age determination. “It didn’t really emphasize the types of cases that radiologists found most difficult/challenging in routine clinical practice,” Siegel says. Also, AI programs did not typically look for change over time but were just used to detect, diagnose, or quantify findings in a single study. Except for studies done in emergency departments, radiologists are typically looking for change from prior studies, Siegel says.
Some in radiology also were fearful that AI would eliminate the need for their services. “Just a few years ago, radiologists were terrified,” says Paul Chang, MD, FSIIM, a professor of radiology and vice chair of radiology informatics at University of Chicago Medicine. “They heard people saying, ‘AI is going to replace us.’”
Recently, however, radiology has come to embrace AI with more zeal. One reason is that radiologists have seen that AI won’t replace them but rather enhance their ability to do their jobs. Time and experience have shown that the belief that AI would replace radiologists was “a false and arrogant perspective,” enabling radiologists to move on, Chang says.
Another reason, Geis says, is that probability-based decision-making computer programs use more sophisticated algorithms that are trained on data. While Geis personally believes “AI” is a “clickbait phrase,” he agrees that algorithms “potentially provide better predictions” than radiologists could previously obtain. AI, or as Geis prefers to call it, computer vision machine learning (ML), has been shown in other industries to help with logistics of people, resources, and scheduling, he says.
“It helps predict no-shows to radiology appointments. It’s already being used inside scanning equipment to make the equipment better and faster,” Geis says. “These analytic areas aren’t as sexy or charismatic as reading an MRI scan ‘better than doctors,’ but they actually work reliably in practice and at scale, and their returns on investment [ROI] are potentially enormous.”
Siegel agrees that the potential for AI to improve workflow, especially follow-up communication of radiology reports, results, and recommendations, is tremendous. Also, Siegel says, although it hasn’t happened yet, AI has the potential to deliver effective solutions for hanging protocols, something no major PACS vendor has achieved in the nearly 30 years since PACS was introduced. Thanks to AI, “I think these are coming and will have a positive substantial impact,” Siegel says.
So how do radiologists convince the C-suite to embrace AI and encourage its use? In a word: analytics. “Administrators will find AI persuasive if it improves patient outcomes and decreases costs,” Shamout says.
Chang agrees: “What moves the needle in the C-suite is anything that improves productivity, reduces variability, reduces errors, and improves efficiency, and you can use analytics to show that AI does that in radiology.”
IT Infrastructure Needed
Before talking analytics with the C-suite, Chang notes that a “sort-of Catch-22” has to be resolved. AI is highly data intensive, he says. “It’s a statistical methodology. That’s an oversimplification but not by much.” Because AI is data driven, it requires robust and secure data access and interoperability, and that interoperability is generally lacking for most hospitals, which is what leads to the Catch-22.
“If you go to the decision makers and say, ‘We need a budget for AI, and AI requires this robust IT infrastructure,’ the problem is the C-suite will come back to you and say, ‘We are not going to invest in IT infrastructure unless you give us this killer AI app,’” Chang says. “However, you can’t build your killer AI app unless you invest in the IT infrastructure.”
Therefore, Chang believes, radiology should “hitch its wagon to analytics” for two reasons: The C-suite is already sold on the utility of investing in IT for improved analytics, and these IT investments also help AI initiatives. “In addition, radiologists need to embrace analytics to show evidence that AI algorithms can do things that actually can help us,” he says.
The time to do this is now, and it couldn’t be more important, Chang says. According to Chang, radiology is “barely hanging on.” It’s not just because of the global pandemic; the demand for services of growing complexity in radiology is increasing, while the number of people working in the field is decreasing. The demand for radiology is on the rise, thanks to an aging population that requires more frequent medical attention, rising emergency department volumes where advanced CT and MRI procedures have become more accessible, and rising demand for teleradiology services. Meanwhile, the number of radiologists is declining: their ranks are aging along with the general population, fewer people are pursuing radiology training, and recruiting has become more competitive. AI could help achieve and maintain a much-needed balance between demand for radiology services and those who are able to provide them, Chang says.
AI is particularly good at some of the more tedious tasks that radiologists, as humans, are not, Chang says. “As radiologists, we are asked to play games like ‘Where’s Waldo?’ hundreds of times a day. We’re expected to do something we’re not actually wired to do well,” he says. AI, on the other hand, is good at looking for differences in patterns and could help immensely in this and other tedious tasks.
AI is also good at some of the administrative tasks that radiologists may not be, Chang says. For example, AI could be responsible for making sure that a patient with a history of smoking is reminded to follow up findings on his CT lung cancer screening so he doesn’t wait and come back four years later with stage 4 cancer. If AI were to assist radiologists with these administrative tasks, it would leave them more time to do the harder job of interpreting studies and, perhaps, make their job more attractive to medical students, Chang says.
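A reminder system of the kind Chang describes could start as simple, transparent rules before any learned model is involved. The sketch below is purely illustrative: the record schema, the Lung-RADS categories chosen, and the follow-up intervals are assumptions for the example, not details from the article.

```python
# Hypothetical sketch of rule-based follow-up tracking for lung cancer
# screening: flag reports whose assessment category warrants follow-up
# imaging but that have nothing scheduled. Schema, categories, and
# intervals are illustrative assumptions only.

from datetime import date

def needs_follow_up_reminder(report, today):
    """Flag a screening report if its category calls for follow-up imaging
    and the recommended interval has elapsed with no exam scheduled."""
    intervals = {"3": 180, "4A": 90}  # days until follow-up CT (simplified)
    interval = intervals.get(report["lung_rads"])
    if interval is None:
        return False  # e.g., category 1/2: routine annual screening only
    overdue = (today - report["exam_date"]).days > interval
    return overdue and not report["follow_up_scheduled"]

report = {
    "patient_id": "12345",          # made-up identifier
    "lung_rads": "3",
    "exam_date": date(2021, 1, 5),
    "follow_up_scheduled": False,
}
print(needs_follow_up_reminder(report, date(2021, 9, 1)))  # -> True
```

A production system would pull these fields from the RIS/EHR and route flagged patients to schedulers, but even this kind of deterministic rule captures the safety-net role Chang describes.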
Informed Decision Making
Shamout says that NYU’s success with AI highlights its role “in quickly learning tasks that radiologists are not used to performing, especially during a stressful pandemic.” Thus, she says, AI “has the potential to assist the clinical workforce when there are shortages in radiologists.” Shamout believes that AI technology will inevitably make its way to the clinic—in the foreseeable future. The likelihood of that happening compels radiologists “to look into how they could best collaborate with AI to make the best-informed decisions for patients,” she says. “This will require establishing trust between radiologists and AI. It also will require establishing a feedback loop to record diagnostic errors on both sides, in order to fine-tune the AI system and improve its performance, especially since its learning may reflect biases in the data incurred by human bias.”
Radiologists also should be measuring improvements in diagnostic accuracy when collaborating with AI. “This could mean increasing sensitivity and specificity and reducing unnecessary costly interventions, such as biopsies,” Shamout says. No doubt, administrators will find all these arguments in favor of AI persuasive if they show that it improves patient outcomes and decreases costs.
In other fields, ML models are proving valuable in logistics and workflow. Geis says that these types of products should be hugely valuable to the field of radiology, too. “They’re not sexy, but they can have excellent ROI, and they often give good results with more transparent ML methods, such as decision trees,” Geis says. For example, they can identify patterns that humans might not see to estimate capacity and workload more accurately and precisely.
Another example: ML could be used to figure out how to schedule patients for the exact amount of time predicted to do an exam when given information about the patient, such as medical history, reason for study, presence of priors, and other data. And the same information could be used to predict patients who will be no-shows, assisting with scheduling, Geis says.
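The transparent ML methods Geis mentions, such as decision trees, can be sketched in miniature. The example below is an assumption-laden illustration, not anything from the article: it fits a single decision stump (a one-split tree) to toy scheduling data to predict no-shows. A real system would use richer features, far more data, and proper validation.

```python
# Illustrative sketch of a transparent ML method (a decision stump, the
# simplest decision tree) fit to toy scheduling data to predict no-shows.
# Features and labels are made up for the example.

def fit_stump(rows, labels):
    """Find the single feature/threshold split that misclassifies least.
    Predict 1 (no-show) for feature values >= threshold."""
    best = None
    n_features = len(rows[0])
    for f in range(n_features):
        for t in sorted({r[f] for r in rows}):
            errors = sum(1 for r, y in zip(rows, labels)
                         if (r[f] >= t) != bool(y))
            if best is None or errors < best[0]:
                best = (errors, f, t)
    return best[1], best[2]  # (feature index, threshold)

# Toy features: [days between booking and appointment, prior no-shows]
rows   = [[2, 0], [5, 0], [30, 2], [45, 1], [3, 0], [60, 3]]
labels = [0,      0,      1,       1,       0,      1]  # 1 = no-show

feature, threshold = fit_stump(rows, labels)
print(f"split on feature {feature} at {threshold}")
```

The appeal of such models in logistics, as Geis notes, is that the learned rule (here, a long booking lead time in this toy data) is directly inspectable, so schedulers can see and sanity-check why a patient was flagged.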
Geis is aware that he may have a reputation among radiologists as “Debbie Downer,” when it comes to his views on charismatic, exam-reading ML. “But I remain wary of expecting any time soon to see multiple exam-evaluating ML products in widespread practice, at scale, in every hospital,” he says. Before ML achieves more widespread use in radiology, he says, the leaders will have to find ways to improve not only the handling of data shifts but also the development of “new sophisticated systems to manage and monitor complex webs of data inputs and outputs that have to function reliably and consistently.”
In other industries, these handling tasks are known as DevOps or MLOps. “Radiology doesn’t even have systems specifications or job descriptions for who we need to build and operate such systems,” Geis says. However, on a positive note, he adds, the Society for Imaging Informatics in Medicine is beginning to move strongly into this area.
Siegel believes a few things have to happen before there is more widespread adoption of AI in radiology: better integration into existing workflows and the ability to try and then buy and add AI apps on a platform, whether it’s an existing PACS, an independent platform, or a platform associated with speech recognition/report generation. “It is difficult enough to add a single application without a platform, much less potentially dozens of applications,” he notes.
There also needs to be more packaging of applications for clinical diseases, eg, pulmonary, vascular, neurodegenerative, and prostate, he says. Better methods of sharing experiences with algorithms with others would also help. Siegel is somewhat optimistic that these needed improvements are on the way. “These platforms are rapidly emerging and are improving,” he says.
ML and AI can significantly “up the game” for current analytics tools, which are largely descriptive, and help identify more complex patterns and opportunities for improvement, Siegel says. He points out that business analytics are increasingly using ML and AI. Siegel has no doubt that imaging departments will learn to use more advanced analytics techniques, as well. After all, he says, radiologists have been early adopters of technologies and have long been the ones pushing for computer networks, improved speed, larger data archives, and faster imaging processing, as well as improved visualization tools and devices.
— Beth W. Orenstein of Northampton, Pennsylvania, is a freelance medical writer and regular contributor to Radiology Today.