Analytics — From Data to Decision
By David Yeager
Vol. 14 No. 11 P. 20
Radiology typically houses many silos of information. Combining those databases and analyzing them collectively to inform decision making poses a complex challenge for informatics experts.
The growing volume of medical data represents both a challenge and an opportunity for radiology. As health care data become increasingly digitized, more aspects of patient encounters are being quantified than ever before, offering new possibilities for improvement in practice management and patient care.
The problem is that the human brain still has roughly the same capacity it had before computers were invented. Without some way to aggregate and digest the data, developing useful metrics is difficult at best. Analytic tools that can collect and sort through large volumes of data to provide useful insight to users are slowly making their way into medical practice, but there’s a long way to go before the value of those data is widely realized.
“Today, we have all of this great data being collected, but very little of it is being utilized,” says Paul G. Nagy, PhD, FSIIM, CIIP, an associate professor and the director of quality in the Russell H. Morgan department of radiology and radiological sciences at Johns Hopkins University. “So I definitely think the call to arms is that radiology departments are sitting on a gold mine of data, but they don’t understand what’s actually going on in their department, and they can’t predict what’s going to happen or how some outside changes are going to affect it.”
Tools that present data in a visually understandable format, often called dashboards, can help uncover underlying data trends. Nagy says many radiology departments and practices either are adopting or considering some form of dashboard. Although there are a few commercially available solutions, radiology groups and departments sometimes build their own, using existing business intelligence tools to develop their own practice-specific metrics. Dashboards are used to measure efficiency and productivity or to understand complex business operations. The metrics that can be used are limited only by the imagination, but Nagy says they tend to fall into three categories:
• quality (how well a function is performed);
• productivity (how often it is performed); and
• performance (how fast it is performed).
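Nagy’s three categories can be made concrete with a small sketch. The example below applies them to a single function (report finalization); the event records and numbers are invented for illustration, not drawn from any real dashboard product.

```python
# Hypothetical event log for one measured function (report finalization).
# "ok" marks an error-free report; "minutes" is time to completion.
events = [
    {"ok": True,  "minutes": 35},
    {"ok": False, "minutes": 50},
    {"ok": True,  "minutes": 20},
]

quality      = sum(e["ok"] for e in events) / len(events)       # how well
productivity = len(events)                                      # how often
performance  = sum(e["minutes"] for e in events) / len(events)  # how fast

print(f"quality={quality:.2f} productivity={productivity} "
      f"performance={performance:.1f} min")
```

The same three numbers can be computed per radiologist, per scanner, or per site; the taxonomy is about what to measure, not about any particular tool.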
Operational efficiency is one goal of analytic measurement: for example, the time between when a patient is scheduled and when he or she is scanned, or between when the patient is scanned and when a report is finalized and delivered to the referring physician. Other measures, such as scanner utilization and who’s reading which cases, also provide useful information.
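The intervals described above reduce to simple timestamp arithmetic once the data are extracted. This sketch computes mean wait time and mean report turnaround from a hypothetical exam log; the field names and records are illustrative, not from any specific RIS or PACS schema.

```python
from datetime import datetime

# Hypothetical exam records with the three timestamps mentioned in the text.
exams = [
    {"accession": "A100", "scheduled": "2013-11-04 08:00",
     "scanned": "2013-11-04 08:40", "report_final": "2013-11-04 10:10"},
    {"accession": "A101", "scheduled": "2013-11-04 09:00",
     "scanned": "2013-11-04 09:15", "report_final": "2013-11-04 12:45"},
]

def minutes_between(start, end):
    """Elapsed minutes between two 'YYYY-MM-DD HH:MM' timestamps."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 60

wait_times = [minutes_between(e["scheduled"], e["scanned"]) for e in exams]
turnaround = [minutes_between(e["scanned"], e["report_final"]) for e in exams]

print("mean wait (min):", sum(wait_times) / len(wait_times))
print("mean report turnaround (min):", sum(turnaround) / len(turnaround))
```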
Nagy says some practices are beginning to incorporate customer relations management dashboards. He says practices and hospitals want to track which physicians order what types of tests and how many. They want to apply appropriateness criteria to those orders. Measuring patient wait times throughout the entire encounter can identify gaps in the department or facility’s workflow.
Nagy says analytics also enable facilities and departments to begin developing their own patient satisfaction surveys. Many people currently rely on the Hospital Care Quality Information from the Consumer Perspective survey, from which Nagy says it is difficult to extrapolate radiology-specific information. Such metrics help ensure that referrers are getting the level of service they expect, but he has seen only a few of these surveys so far.
Analytics can improve patient care, too. Radiation dose-tracking tools can help ensure that patients are being given the smallest radiation dose necessary. Tracking recalls for nondiagnostic procedures may help uncover areas where protocols need to be rethought or physicians need more education.
The wide variety of metrics underscores the potential benefits of dashboards and other analytic tools, but it also leads to a more basic question: What should be measured? Data can be collected on patient flow, technologists, radiologists, dose parameters, report accuracy, and myriad other aspects of health care delivery. Tessa S. Cook, MD, PhD, a cardiovascular imaging fellow in the radiology department at the Hospital of the University of Pennsylvania, says it’s important to consider how the data will be used. For a typical clinical radiology practice, she says patient flow and provider metrics provide the most value.
Cook, who will jointly moderate a session about the use of business analytics in radiology at RSNA 2013 with Nagy and Katherine P. Andriole, PhD, says patient flow metrics are particularly useful because patient flow is a central component of patient satisfaction and quality of care. As the health care system moves toward value-based care, those two factors will grow increasingly important. Identifying inefficiencies in patient flow or areas of patient discomfort can improve quality throughout the entire department. Cook points out that the Centers for Medicare & Medicaid Services and other agencies already have begun looking at ways to incorporate patient satisfaction measures into reimbursement formulas.
Measuring Radiologists’ Work
Provider metrics are trickier. Turnaround time and study quality are useful for evaluating technologists, but radiologists’ value lies in the reports. Gauging radiologists’ effectiveness is a more interpretive exercise.
“Radiologist metrics are more challenging,” Cook says. “What is the right metric for a radiologist? We really don’t have them. We have things like turnaround time; it’s very easy to track patient flow through the department. Scanner utilization, complications, those things are easier to track, but the question of an appropriate metric for a radiologist is still up in the air.”
What Are You Lookin’ At?
How best to measure radiologists is a question that David Miller, MD, a radiologist at South Jersey Radiology Associates (SJRA), has wrestled with for some time. Miller has administrative responsibilities within his practice, a large imaging chain in the Philadelphia area. SJRA uses Montage Search and Analytics, which is integrated with Nuance’s PowerScribe 360/Reporting application, to gather practice metrics.
Miller says analytics have enabled more consistent peer review of radiology reports. To begin, the software retrospectively flags left/right and male/female discrepancies, which he says is important because referring physicians easily miss them. Miller attributes these types of errors mainly to the default templates used with structured reporting and to radiologists’ time constraints, but most SJRA members have become more mindful of their reporting habits since these errors have been highlighted, leading to fewer discrepancies.
Critical results management is another significant area that can benefit from analytics. Miller says SJRA doesn’t use analytics for critical results reporting, but a clinician or an administrator can identify criteria for reports that require documented person-to-person follow-up and run a search on them. For example, a list of all pneumothorax findings can be checked to see whether the report was called in and who received it. If there’s no confirmation of receipt, the practice can follow up.
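The audit Miller describes amounts to a keyword search joined against communication records. The sketch below is a minimal illustration of that idea, assuming invented report records and a hypothetical `called_to` field for documented person-to-person follow-up; it is not an actual Montage or PowerScribe query.

```python
# Critical findings to audit; the list is illustrative.
CRITICAL_TERMS = ["pneumothorax"]

# Hypothetical finalized reports with a field recording who, if anyone,
# received a documented person-to-person communication of the result.
reports = [
    {"accession": "A200", "text": "Moderate right pneumothorax.",
     "called_to": "Dr. Smith"},
    {"accession": "A201", "text": "Small apical pneumothorax.",
     "called_to": None},
    {"accession": "A202", "text": "No acute cardiopulmonary disease.",
     "called_to": None},
]

def needs_followup(report):
    """Flag reports that mention a critical term but lack documented contact."""
    text = report["text"].lower()
    return any(term in text for term in CRITICAL_TERMS) and not report["called_to"]

flagged = [r["accession"] for r in reports if needs_followup(r)]
print(flagged)
```

A real implementation would also need negation handling ("no pneumothorax") and synonym matching, which is part of why vendors build dedicated search tools for this.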
Analytics also can allow for a deeper look at report quality. In addition to allowing the practice to apply ACR criteria to reports, Miller and his colleagues can look at subsets of reports and see whether they are clear and concise and whether they include appropriate, well-defined recommendations. They also can research specific cases or conditions and compare the images. Miller says the ability to design and save reports has made data analysis much more efficient, but it requires some trial and error to set up reports that provide the information you want. “You can design a report and save it but, until you determine what you want, it’s hard to know whether you’re asking the right questions,” he says. “The data’s in there. It’s just a matter of asking the question to get it out, and that takes time.”
Miller says analytics have not only helped improve report quality, but they’ve also helped drive practice decision making. In addition to chronicling how many exams each radiologist is reading, relative value units can be assigned to each exam to develop a more complete picture of what each radiologist is doing. Opinions vary on how to share such information in a practice. Miller says that, in his experience, publishing the numbers tends to cause divisiveness in a practice.
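The RVU-weighted picture described above is a weighted count rather than a raw one. This sketch shows the idea with an invented RVU table and exam log; real work RVUs come from the CMS physician fee schedule and vary by CPT code, so none of these numbers should be taken as actual values.

```python
from collections import defaultdict

# Hypothetical relative value units per exam type (illustrative only).
RVU = {"CT head": 1.0, "MRI brain": 2.0, "chest x-ray": 0.22}

# Hypothetical log of (radiologist, exam type) for a reporting period.
exam_log = [
    ("Dr. A", "CT head"), ("Dr. A", "MRI brain"),
    ("Dr. B", "chest x-ray"), ("Dr. B", "chest x-ray"), ("Dr. B", "CT head"),
]

totals = defaultdict(float)  # RVU-weighted workload per radiologist
counts = defaultdict(int)    # raw exam count per radiologist
for radiologist, exam in exam_log:
    totals[radiologist] += RVU[exam]
    counts[radiologist] += 1

for doc in sorted(totals):
    print(doc, counts[doc], "exams,", round(totals[doc], 2), "RVUs")
```

Note how the raw count and the RVU total can rank radiologists differently, which is exactly why a weighted view gives the "more complete picture" the text describes.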
Analytics can provide marketing information. Referral patterns can be tracked to help identify trends. For example, if a particular group that used to send CTs and PET scans suddenly stops sending CTs, the practice can reach out to find out why.
Geographic trends also can be examined. Management reports can be created by referrers’ office location, the number of exams that each referrer is sending, and the types of exams to get a clearer picture of where business is coming from and where it can be grown. Although these types of calculations aren’t new, there hasn’t always been data to support them. “These are things people have done historically, but they didn’t always have the data to back up the decisions or they’d have to do it manually,” Miller says. “Now, you write the search, and the computer executes it for you. And that’s powerful. Over time, we’ll have additional trending data, which will also be beneficial. That will really help us decide if we are making the correct decisions.”
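The referral-pattern search Miller describes boils down to grouping exam volume by referrer, period, and modality, then watching for drops. A minimal sketch, assuming an invented referral log:

```python
from collections import Counter

# Hypothetical referral log: (referrer group, month, modality).
referrals = [
    ("Group A", "2013-09", "CT"), ("Group A", "2013-09", "PET"),
    ("Group A", "2013-10", "PET"),  # Group A's CT referrals stop in October
    ("Group B", "2013-09", "CT"), ("Group B", "2013-10", "CT"),
]

# Exam volume keyed by (group, month, modality); missing keys count as zero.
by_group_month = Counter((g, m, mod) for g, m, mod in referrals)

def modality_trend(group, modality, months):
    """Monthly volume series for one referrer group and modality."""
    return [by_group_month[(group, m, modality)] for m in months]

months = ["2013-09", "2013-10"]
trend = modality_trend("Group A", "CT", months)
print("Group A CT volume:", trend)  # a drop to zero flags outreach
```

Grouping by the referrer's office location instead of the referrer group gives the geographic view described in the text; the aggregation logic is the same.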
Moving forward, Miller hopes to use analytics prospectively, but at this point the reports are still retrospective. He says it’s challenging to anticipate future needs, but the existing tools may help point him in the right direction. There are a couple of items on his wish list, however, such as real-time alerts for report discrepancies and automatic searches for examples of uncommon diagnoses, that he envisions being available down the road.
The question of how much can be done with analytics in radiology is a hot one at the moment. Rasu B. Shrestha, MD, MBA, vice president of medical IT at the University of Pittsburgh Medical Center, says case-centric applications, such as the ones being used in radiology today, generally are how analytics work their way into an industry. He believes it’s time for radiology to take a platform-centric approach to radiology informatics, though, and “elevate above the application.”
“You’re limited by the silos that you have. You have an RIS platform; you have a PACS system; you have a 3D imaging solution; you may have a dose-monitoring tool—all of these siloed information systems at the back end,” Shrestha says. “The challenge is how does radiology start thinking big beyond the application level? I think we’re not quite there yet. As a broader community, radiology is still pretty much hung up on the fact that it’s nice for us to have these specific dashboards running ad hoc inquiries on top of our RIS databases. But I do think, with the focus now on value-based imaging, there is a burning need for us to really look at analytics as a whole scenario. My dashboards are changing. My metrics are changing.”
As value-based imaging becomes a bigger driver of radiology services, Shrestha, who is moderating a couple of analytics sessions at RSNA 2013, says what was good enough yesterday won’t necessarily be good enough tomorrow. He says radiology should work with other areas of HIT, some of which already use enterprise-level data, to broaden its vision of analytics. Provider organizations also may have ongoing data initiatives that radiology can get involved with. If they don’t, he says it’s time to start asking for them.
Another way that radiology departments and practices can begin developing a more comprehensive analytics strategy is to communicate their needs to their vendors. Shrestha recommends asking for access to the databases that underlie RIS, PACS, and 3D imaging and finding out whether open application programming interfaces could be exposed to a vendor’s prebuilt analytics package. Unless customers demand access, vendors will be reluctant to provide it.
Moving forward, Shrestha sees a need for all of the data in radiology’s application systems to be stored in a common data warehouse. From there, practices and departments can work their way up to decision-support capabilities and, ultimately, personalized medicine, such as identifying effective radiation dose and appropriateness criteria, enabling computerized physician order entry, and tying radiology systems back to the EMR. Enterprise-level analytics also would allow clinicians to define best practices and use population-based modeling to predict outcomes.
Nagy agrees that linking to other systems is the next logical step for radiology, and he sees the EMR as a “treasure trove” of valuable information. Over the next five years, he expects to see significant innovation allowing radiology to tailor its services to specific populations and better define the value of imaging. But, he adds, it will take collaboration with IT and support from management to make it happen.
“I think analytics has really proven that it can transform many different industries, and it’s usually IT with a combination of analytics that helps transform the industry. It’s usually not IT alone,” he says. “With analytics, there’s got to be a management revolution to be able to harness that data, empower the front line with the data that’s needed to improve operations, and build a better culture for safety and quality.”
— David Yeager is a freelance writer based in Royersford, Pennsylvania. He writes primarily about imaging informatics topics for Radiology Today.