Ensemble Approach — Building IT Infrastructure for AI
By Kathy Hardy
Radiology Today
Vol. 20 No. 8 P. 12

The current spotlight on AI tools for radiology has put the emphasis on the development and validation of individual applications. Radiology practices are weighing which applications would be most useful, often targeting the “low-hanging fruit” of their imaging needs or selecting the few options they can easily incorporate into their workflow. The result of this picking and choosing is a disjointed collection of tools in need of a unifying mechanism. The intelligence that deep learning brings to diagnostics and treatment needs an IT infrastructure, built on a common platform, to support this powerful resource.

Eliot Siegel, MD, FACR, FSIIM, a professor and vice chair of the department of diagnostic radiology at the University of Maryland School of Medicine, compares the situation to that of music lovers shopping for songs. They want to download only select songs from a music streaming service, rather than purchase an entire album and end up with playlists full of songs they don’t want.

“Like in music, radiology departments don’t want to buy an entire package,” Siegel says. “They want to pick and choose best-of-breed solutions. We need a way to do this in radiology. We have PACS, but we don’t have the ability to easily or quickly add one-off contracts with each application vendor. We need a mechanism to consolidate the applications that work best for our individual needs.”

All Together Now …
Experts in AI note a growing number of vendors getting into the space, creating new applications for deep learning. Siegel says that, while the influx brings new perspectives to the technology, it can also disrupt the status quo of the imaging community.

“In the past, new technology developments in radiology came from the major modality vendors, such as GE, Siemens, or Philips,” he says. “The introduction of the large number of new AI vendors is disruptive. It adds to the excitement and hype about AI in radiology circles, but also leads to questions about what choices to make.”

Variability among vendors is just one of the changes brought about by AI’s growing place in radiology. Another is development time. Siegel points to computer-aided detection, which relies on a variety of technologies that typically require a high level of expertise and a lengthy development timetable, though they still yield quality applications. Now, Siegel says, the ability of deep learning to generate useful algorithms directly from imaging databases speeds up application development.

“By shortening the development time, deep learning will result in a democratization of AI and an increased number of applications,” he says. “This makes it easier for more new vendors to get involved. You can still access apps from the major modality vendors and advanced visualization providers but, with deep learning, there is an increased number of opportunities for what the smaller and potentially more nimble AI developers can do. Instead of a small number of large radiology companies with multiple applications, we now have more vendors producing fewer but often more unique and specialized applications.”

For now, many radiologists are using multiple traditional imaging suites to access the applications that best serve their needs. This can cause problems with consistency and requires users to learn different interfaces. There are some industry options for running multiple, varied applications under a single system, but Siegel says movement in that direction has been slow. One-stop mechanisms need to enable applications to work in concert, he says, so users can view images and see the analysis in one place.

“We need an ensemble approach,” Siegel says. “We need a platform that allows us to pick and choose applications and then enables them to work together cooperatively in series or parallel.”
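To make the idea concrete, here is a minimal sketch of what such a platform’s orchestration layer might do, written in Python. The detector and triage functions are hypothetical placeholders, not products Siegel names; a real platform would wrap vendor applications behind a common interface.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical AI "applications": each takes a study (here, a dict of
# metadata standing in for pixel data) and returns its findings.
def lung_nodule_detector(study):
    return {"nodules": []}        # placeholder result

def rib_fracture_detector(study):
    return {"fractures": []}      # placeholder result

def triage_prioritizer(findings):
    # Runs in series after the detectors, combining their output
    # into a single worklist priority.
    urgent = findings["nodules"] or findings["fractures"]
    return "STAT" if urgent else "ROUTINE"

def run_ensemble(study):
    # Independent detectors run in parallel on the same study...
    with ThreadPoolExecutor() as pool:
        nodules = pool.submit(lung_nodule_detector, study)
        fractures = pool.submit(rib_fracture_detector, study)
        findings = {**nodules.result(), **fractures.result()}
    # ...then a downstream application consumes their combined output.
    return {"findings": findings, "priority": triage_prioritizer(findings)}

print(run_ensemble({"modality": "CR", "body_part": "CHEST"}))
```

The point of the sketch is the shape of the problem: some applications can run side by side on the same study, while others depend on upstream results and must wait their turn.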

Getting to this point may be challenging. Siegel says next steps pertain to workflow: As radiologists look to adopt new applications, especially for “boutique” use, scalability becomes more difficult.

“We need to look at whether these new platforms will become part of our existing workflow of interpreting studies, or will we need to reinvent our workflow and platforms?” he says. “Interoperability will be a big issue, as will integration into workflow and the ability to scale up. It’s fascinating to see one AI application at a time do well independently. Now, we need to get to the next phase, where we can create a practice around this new workflow.”

Proprietary vs Open Source
Creating platforms that enable AI applications to work together is a goal in building the right IT infrastructure. With that, however, come concerns about whether these platforms comprise proprietary components. Bradley Erickson, MD, PhD, a radiologist with the Mayo Clinic in Rochester, Minnesota, sees the exclusivity of proprietary systems as a concern with regard to adoption. When radiologists become locked into using the same vendor for all components of a system, it limits the potential for changing components as needed. He recounts similar concerns when PACS was introduced.

“When PACS started, it was proprietary,” Erickson says. “When you needed to make a change, you had to completely change the entire system. You couldn’t just change one component. That’s why you want to use a vendor-neutral system. You want to be able to bring in the best of all the AI tools. It’s not a one-size-fits-all.”

Despite the desire to keep the platform open to a variety of AI applications, Siegel notes that, at least in its infancy, the development of proprietary systems could lead to the development of standards for interoperability.

“If you create a sophisticated interface to integrate various AI applications today, it would most likely have to be proprietary,” Siegel says. “One of those proprietary systems might achieve a critical mass similar to the case with video equipment such as VCRs and television broadcast resolution. So, even though it starts out as a proprietary system, a highly successful one could become or at least inspire the de facto standard for AI interfaces.”

Erickson’s approach is a vendor-neutral system that brings together as much data about the patient as possible. In the course of researching aspects of computer-aided diagnosis and the use of computer technologies to extract information from medical images, he began developing a system to promote team science. This system, which brings together previous images and other clinical support data, was first introduced and used at the Mayo Clinic before being introduced commercially as FlowSIGMA.

“Like a team approach in medical care, FlowSIGMA pulls together multiple series and prior exams,” he says. “There can be gaps in diagnosis if you’re not looking at all the relevant patient data. As more AI tools are being built, we see a need to incorporate everything together.”

FlowSIGMA, for which Erickson serves as CMO, started as a system to enable an algorithm that identifies changes in brain tumor patients on MRI. The process is more complex than simple routing or forwarding: the system must recognize that several specific series are present to constitute the exam, and it must query the archive to confirm that a prior exam of the same special type is available. The technology also “time boxes,” meaning that if the analytic process takes longer than a prespecified amount of time, an error-handling workflow can be executed.
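In code, that checking and time-boxing behavior might look something like the following Python sketch. The series names, archive query, timeout value, and handler are illustrative assumptions, not FlowSIGMA’s actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

REQUIRED_SERIES = {"T1", "T1+C", "T2", "FLAIR"}   # assumed brain tumor protocol

def constitutes_exam(series_descriptions):
    # The study only counts as the target exam if every required series is present.
    return REQUIRED_SERIES.issubset(series_descriptions)

def prior_exam_available(archive, patient_id):
    # Stand-in for an archive query (e.g., a DICOM C-FIND) confirming a
    # prior exam of the same special type exists.
    return any(rec["patient_id"] == patient_id and constitutes_exam(rec["series"])
               for rec in archive)

def handle_timeout(exam):
    # Hypothetical error-handling workflow: flag the exam for human review.
    return {"status": "timeout", "exam": exam.get("id")}

def run_time_boxed(analysis, exam, timeout_sec=300):
    # "Time boxing": if the analytic step exceeds its budget, execute the
    # error-handling workflow instead of waiting indefinitely.
    pool = ThreadPoolExecutor(max_workers=1)
    try:
        return pool.submit(analysis, exam).result(timeout=timeout_sec)
    except TimeoutError:
        return handle_timeout(exam)
    finally:
        pool.shutdown(wait=False)   # don't block on the stalled worker
```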

“If one of the AI tools is down, you wouldn’t know it,” Erickson says. “You have PACS, RIS, and EMR that communicate together. AI is the fourth leg of the communication process, providing the infrastructure that helps them all communicate. The capacity to detect errors is needed, but it’s not well understood how to address that need. That’s a problem that can be addressed with the right IT infrastructure.”
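A simple liveness probe is one common way to surface that “you wouldn’t know it” failure mode. In the sketch below, the tool names and health URLs are invented for illustration, and the code assumes each AI service exposes an HTTP health endpoint, which not every product does.

```python
import urllib.request

# Invented registry of AI tools and their (hypothetical) health endpoints.
TOOLS = {
    "brain_tumor_change": "http://ai-node-1:8080/health",
    "ct_denoiser": "http://ai-node-2:8080/health",
}

def down_tools(timeout_sec=5):
    """Return the names of tools that fail a simple liveness probe."""
    down = []
    for name, url in TOOLS.items():
        try:
            urllib.request.urlopen(url, timeout=timeout_sec)
        except OSError:           # refused, timed out, or HTTP error status
            down.append(name)
    return down

if __name__ == "__main__":
    for tool in down_tools():
        print(f"ALERT: {tool} is not responding")   # e.g., notify support staff
```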

Since its introduction, the FlowSIGMA technology has been used for several other workflows, including high-throughput tools such as CT denoising. Erickson says the system is flexible and its modular design allows for integration with other industry-standard tools.

Workflow
When considering AI integration into workflow, Erickson says there are considerations such as graphics processing units (GPUs), which accelerate the massively parallel arithmetic used to train deep learning networks, and central processing units (CPUs), which carry out the instructions for running the finished algorithms. CPUs require less infrastructure than GPUs.

“When people think about what’s needed to integrate AI into workflow, they think about the process of training the system to recognize the data needed to complete the analysis,” Erickson says. “They’re thinking of the power it takes to train the network. The computing required for the inference aspect of integrating AI is low.”

He sees workflow as the biggest challenge, yet this aspect of the process is often underestimated.

“Most AI tools currently work with single images,” Erickson says. “But AI is better with more data. With advancements in imaging technology, we have more data we can include. When I was going through my radiology training, we were told that a patient’s previous exams were your best friend. The same can be said for AI.”

In addition to images, Erickson says the infrastructure to support a successful AI system should also be able to incorporate nonimaging data, such as patient history and current medications. However, “systems today don’t adequately combine all this previous information to optimize tools,” he says.
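One way to express that requirement is in the shape of the data an AI tool receives. Here is a minimal sketch; the class and field names are assumptions for illustration, not any vendor’s schema.

```python
from dataclasses import dataclass, field

@dataclass
class StudyContext:
    """Hypothetical bundle handed to an AI tool: the current images plus
    the prior exams and nonimaging data Erickson says should be combined."""
    current_series: list                          # e.g., paths to DICOM series
    prior_exams: list = field(default_factory=list)
    history: str = ""                             # relevant patient history
    medications: list = field(default_factory=list)
```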

Siegel agrees, noting that AI has limitations when it comes to certain areas of radiology, such as its use in oncology. The current AI paradigm is to address a single task when reviewing a medical imaging study. While this works for diagnostic purposes, it doesn’t address a major radiologist responsibility in oncology: determining whether there has been a significant change over time, with or without a therapeutic intervention. In that case, the task isn’t to diagnose but to track changes in tumors, to see whether the patient is responding to treatment. Radiologists need to compare prior and current exams to quantify that change.
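As a toy example of that quantification, the sketch below compares the sum of target-lesion diameters across two exams, using simplified RECIST-style thresholds; the real criteria include additional conditions, such as a minimum absolute increase, that are omitted here.

```python
def percent_change(prior_sum_mm, current_sum_mm):
    # Percent change in the sum of target-lesion diameters between exams.
    return 100.0 * (current_sum_mm - prior_sum_mm) / prior_sum_mm

def classify(prior_sum_mm, current_sum_mm):
    # Simplified RECIST-style cutoffs: >=30% shrinkage suggests partial
    # response; >=20% growth suggests progressive disease.
    change = percent_change(prior_sum_mm, current_sum_mm)
    if change <= -30:
        return "partial response"
    if change >= 20:
        return "progressive disease"
    return "stable disease"

# Example: lesion diameters summed to 52 mm on the prior exam, 34 mm today.
print(classify(52, 34))   # about -35% -> "partial response"
```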

“Having the ability to access imaging and nonimaging applications, working in concert, is a more sophisticated way of using data in the course of diagnostics or triage,” Siegel says. “It’s a best-in-practice approach, and what’s best in practice is what’s best for patients. Making the decision to use a platform that consolidates all the relevant data is what’s best for workflow, best for efficiency, and best for patient care.”

Siegel would also like to see various AI systems work together cooperatively, just as a panel of experts would collaborate.

“You need more than one AI system,” he says. “With that, you also need a platform on which radiologists can build the AI tools that best fit their needs. With more tools that will need more data, there will be more recognition and greater adoption of a more sophisticated system in the near future.”

— Kathy Hardy is a freelance writer based in Phoenixville, Pennsylvania. She is a frequent contributor to Radiology Today.