Putting the AI in Radiology
By Keith Loria
Radiology Today
Vol. 19 No. 1 P. 10

How artificial intelligence is being used now and where it's headed.

Talk of artificial intelligence (AI) has been running rampant in radiology circles. Many believe that AI, often discussed in terms of machine learning or deep learning, can and will optimize radiologists' workflows, facilitate quantitative radiology, and assist in discovering genomic markers.

Bibb Allen, MD, FACR, CMO of the ACR's Data Science Institute (DSI), notes that "AI-lite" is already being used in radiology in a number of ways, such as computer-aided detection for cancer, auto-segmentation of organs in 3D postprocessing, natural language processing to facilitate critical results reporting, consultation of best guidelines for recommendations, and quantification and kinetics in postprocessing. However, deep learning algorithms have not yet made their way into widespread clinical use.

"We believe that AI is poised to significantly increase the value radiology professionals are able to provide their patients," Allen says. "Adding information acquired from AI algorithms to our reporting and workflow can significantly improve patient care. While AI for imaging will not come all at once, early adopters of AI in their practices will be ready to be future leaders in health care."

Still, some radiologists say that there is too much focus on computers replacing radiologists, which leads to skepticism about AI implementation. Despite these misgivings, most see the real focus as how computers can improve patient care, and they believe that the rest will work itself out.

Allen says those who believe AI is poised to replace radiologists don't fully recognize the value these tools may bring in helping radiologists coordinate and integrate information, identify patients for screening examinations, prioritize patients for immediate interpretation, standardize reporting, and characterize diseases.

"Today, AI is being used for equipment maintenance predictions and building hanging protocols, but, very soon, it will be used as an add-on option for equipment such as CT and MR," says Raym Geis, MD, FACR, senior scientist at DSI and an assistant clinical professor of radiology at the University of Colorado School of Medicine in Fort Collins. "AI can find patterns in data that humans cannot see. This applies to image data, nonimage data such as predicting patient no-shows, or improving workflow."

Paul Chang, MD, FSIIM, a professor of radiology, vice chair of radiology informatics, medical director of pathology informatics, and medical director of enterprise imaging at the University of Chicago, says AI has been popping up in discussions about the industry for decades. Now, however, it is creating a true buzz because of cost-effective hardware acceleration.

"Back when I was at Stanford decades ago, we were doing perceptrons with this gradient descent, and there was something called the 'AI Winter' in the '70s and '80s; it wasn't so much that we didn't understand what needed to be done, but we couldn't, in an economic, scalable way, do it," Chang says. "Today, everyone thinks they can do AI because it's cheap. You download TensorFlow, you get a few NVIDIA cards, and now you're a data scientist and it's easier to do."

Machine Learning vs Deep Learning
Bradley J. Erickson, MD, PhD, an associate professor of biochemistry and molecular biology at the Mayo Clinic College of Medicine in Rochester, Minnesota, notes that AI is a broad field encompassing many technologies for training computers to behave like intelligent beings. Machine learning, the subset of AI built on algorithms that recognize patterns in data, relies on training data sets in which each example is paired with an answer, such as images labeled "cancer" or "not cancer." When given a new image, the trained algorithm makes a prediction, such as "cancer" or "no cancer." Deep learning is a more advanced type of AI.

"Deep learning is the hot new technology that is a more specific form of machine learning, with one major difference being that we don't have to calculate the important features in the examples that it should use for making decisions," Erickson says. "Until recently, all AI applications were machine learning."

Machine learning applications have been used in screening exams such as mammography, chest CT, and colonography. In addition to those detection tasks, some applications also aid in determining whether a lesion is cancerous.

"Now, AI is also helping with efficiency by automatically delineating the cardiac chambers on cardiac CT, making measurement of heart function more efficient and more reproducible," Erickson says. "AI can identify patterns in images with high reliability and may find patterns that are more complex than humans can find."

Pros of AI
In Chang's opinion, deep learning is neither a horrible threat nor a promised savior.
"It's going to be like any other appropriately consumed technology that's going to allow us to redefine what we do, move up the value chain, enhance our value proposition, and improve both efficiency and quality," he says.

According to experts, the benefits of AI for radiology are numerous.

"It can reduce workload by doing tedious tasks like segmenting structures. That can then enable more quantitative imaging, which most believe will improve the 'product' of radiology," Erickson says. "It can also help to detect lesions that may be subtle, which can be particularly useful when the radiologist is tired or distracted. Finally, we and others are showing that it can find information in images that is not perceived by humans—things like molecular markers in tumors."

As an example of AI in action, Allen says to imagine a stroke protocol patient from the emergency department (ED).

"The CT scanner has a brain hemorrhage detector built into its display software and is able to immediately notify the team that there is a hemorrhage," he says. "At that point, the radiologist can confer with the ED physician and other clinical team members so that CT angiography could be performed while the patient was still on the table, enhancing workflow and efficiency for the patients."

Implementation Challenges
Some of the biggest changes in radiology will come from applying AI to workflow and protocols. One challenge is that developers of AI algorithms don't always have a strong medical background or understanding of physician workflow.

Also, while most believe that image analysis will come in the next few years, it remains difficult to build a product because of disparate data input problems. According to Geis, making the financial case for image analysis AI is complex because it is difficult to define a business plan for a narrow AI product that may solve one clinical question on one modality.

"There may be a pricing disparity between what customers will pay [and the costs involved]," Geis says, in light of "the technical complexity to produce a viable product that analyzes disparate data from many different types of machines and even more different protocols," while achieving the objective of fitting the radiologists' workflow.

Japanese researchers recently conducted a study that used AI to identify and analyze polyps found during colonoscopy. The analysis took less than one second per polyp. Overall, 306 polyps were assessed in real time using the AI-assisted system, which achieved a sensitivity of 94%, specificity of 79%, accuracy of 86%, and positive and negative predictive values of 79% and 93%, respectively, in identifying neoplastic changes.
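For readers who want to see how those figures relate to one another, the short Python sketch below computes the standard screening metrics from confusion-matrix counts. The counts shown are hypothetical; the study's actual confusion matrix is not reported in this article, and these numbers were chosen only because they roughly reproduce the percentages above.

# Standard screening metrics computed from confusion-matrix counts (counts are hypothetical).
def diagnostic_metrics(tp, fp, tn, fn):
    """tp/fp/tn/fn are true/false positive and negative counts."""
    return {
        "sensitivity": tp / (tp + fn),                # true positive rate
        "specificity": tn / (tn + fp),                # true negative rate
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "ppv": tp / (tp + fp),                        # positive predictive value
        "npv": tn / (tn + fn),                        # negative predictive value
    }

# Made-up counts for 306 polyps that approximately match the percentages reported above.
print(diagnostic_metrics(tp=134, fp=34, tn=129, fn=9))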

Allen questions what 86% accuracy means for patients. Would it really mean that physicians would not do a biopsy if there was "only" a 14% chance of cancer in the polyp? What would they do if a breast lesion had a 14% chance of being malignant? Of course, they would still do a biopsy, he says.

"The point of that analogy is to show that use cases for AI that just assess risk of a specific disease may not be as good as those that populate our existing guidelines and recommendations," Allen says. "While the accuracy of AI in the study from Japan of colon polyps is certainly impressive, for use in practice, the end points need to be different, and that's where physician input into the use case development process comes in."

It is up to physicians to work with regulators so that AI will be safe and effective for patients, he adds.

Use Cases
Over the next five years, Erickson believes, AI-assisted work efficiencies will be the most immediate benefit because they will not require FDA clearance, nor will there be a need to get the Centers for Medicare & Medicaid Services or other third-party payers to pay for them.

"The benefits will be clear, and the hurdles relatively low," Erickson says. "The next wave will be the diagnostic tools. Of course, some already exist, but tools that measure things more reliably or that can find molecular markers will tremendously improve patient care, and [patient care] will drive their use."

Chang believes that by 2020 the industry will start to see some of the early winners—certain identified use cases that actually move the needle.

"You have to have validated data sets to train [the algorithms], and so the use cases now are just being driven by data availability, not by cases that people care about. No one cares about bone age," Chang says. "A good example of that is a lot of people are using deep learning for breasts for mammography, and the reason why they're doing it is because those are the only data that are actually available that have proof. Eventually, we will have done what I hope we do, and that is improve our IT infrastructure like other industries to support generalized big data problems, and then you'll see use cases that are actually compelling."

The ACR believes that the most effective use of AI in health care will be around narrow use cases.

"If these use cases are developed in coordination with input from physicians, so that the inputs and outputs can be appropriately defined, AI will significantly improve that care we are able to provide our patients as part of our clinical workflow in disease characterization, disease detection, and standardization of reporting," Allen says. "It will be an evolution built on successful development of AI use cases. Clinical practice integration in the long term will include linking multiple use cases together to solve increasingly challenging problems."

With that in mind, in the years ahead, the ACR DSI has several goals: to define clinically relevant use cases for the development of AI algorithms in medical imaging, IR, and radiation oncology; to establish a methodology and provide tools and metrics for creating algorithm training, testing, and validation data sets around these use cases; to develop standardized pathways for implementing AI algorithms in clinical practice; to create opportunities for monitoring the effectiveness of AI algorithms in clinical practice; and to address the regulatory, legal, and ethical issues associated with AI in medical imaging, IR, and radiation oncology.

"For the practicing radiologist, 'investing' means to watch for upgrades to modalities and reporting software that incorporate AI modules so they can include AI tools in their practices," Allen says. "I don't believe that machines will be ready to be physicians any time soon, nor do I believe our society will want them to. While we all know an autopilot can fly an airplane, few of us would want to fly on a plane without a pilot. Yet, at the same time, we would demand that the safety features the autopilot software brings to our pilots be developed and maintained."

Allen sees the same for AI in medicine.

"AI will augment the care we provide," he says. "Will it, over time, change the way we practice? Probably, but most likely in a way that will be good for our professions. Early adopters will have a head start in making that transition."

— Keith Loria is a freelance writer based in Oakton, Virginia.