Imaging Informatics: Structural Challenge — University of Rochester Radiologist Spearheads Effort to Improve Clinical Recommendation Follow-Ups
By Dave Yeager
Radiology Today
Vol. 21 No. 1 P. 5

Follow-up recommendations are one of the most important aspects of a radiologist’s report, yet too often they aren’t heeded in a timely manner or, sometimes, at all. In 2014, Ben Wandtke, MD, MS, the medical staff president and chief of diagnostic imaging at the University of Rochester Thompson Health in Canandaigua, New York, and an associate professor in the department of imaging sciences at the University of Rochester Medical Center, noticed that patients were presenting at Thompson Health with end-stage cancers despite previous recommendations for follow-up. He decided to find out why, and what he discovered was a two- to three-year gap between some radiology recommendations and the patient’s next imaging encounter.

“About 50% of the patients at our facility were not getting the follow-up imaging that we recommended, and we really didn’t know why,” Wandtke says. “The system was losing track of these people, and it became our goal to try to solve that problem.”

In an effort to prevent patients from “falling through the cracks” of the system, Thompson Health’s diagnostic imaging department used Microsoft Access to build a database of follow-up recommendations. The program targeted recommendations that specified both a follow-up date and the imaging modality to be used.

Thompson Health began to measure the process in real time to identify precisely where recommended follow-ups were being missed. This program, called Backstop, determined that recommendations not acted on within one month of their due date were at risk of never being performed. To improve follow-up rates, the team implemented a three-stage intervention workflow.

First, they reached out to patients’ primary care providers, if the patient had one, and re-sent the radiology report with a cover letter explaining why the report was being sent a second time, pointing out that the recommendation hadn’t been addressed. This worked approximately one-third of the time. After another month, if no action was taken, a nurse called the referring physician’s office to find out whether the recommendation had been handled and, if so, how. After one more month, the radiologist who made the recommendation would call the primary care provider to explain why the recommendation was made and why it was important.
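The one-month risk window and three-stage escalation described above can be sketched in code. This is a hypothetical illustration, not Thompson Health’s actual system; the data fields, the 30-day month approximation, and the stage-per-month thresholds are all assumptions for the sketch.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical stages, paraphrasing the article's three-stage workflow.
STAGES = {
    1: "Re-send report to primary care provider with cover letter",
    2: "Nurse calls the referring physician's office",
    3: "Radiologist calls the primary care provider",
}

@dataclass
class Recommendation:
    patient_id: str
    modality: str                     # e.g., "CT chest" (assumed field)
    due: date                         # recommended follow-up date
    completed: Optional[date] = None  # date follow-up was performed, if any

def intervention_stage(rec: Recommendation, today: date) -> int:
    """Return 0 if no action is needed, else the escalation stage (1-3).

    Per the article, a recommendation not acted on within one month of its
    due date is at risk; each further month escalates the intervention.
    """
    if rec.completed is not None:
        return 0
    months_overdue = (today - rec.due).days // 30  # coarse month approximation
    if months_overdue < 1:
        return 0
    return min(months_overdue, 3)

rec = Recommendation("pt-001", "CT chest", due=date(2019, 1, 15))
print(intervention_stage(rec, today=date(2019, 3, 20)))  # two months overdue -> 2
```

A navigator could run this over the whole database each day and work the resulting task list, which matches the escalation cadence the article describes.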

With some volunteers from the imaging department and some funding from the hospital’s malpractice insurer, the database began to pay dividends. Within 13 months, there were no new cases of delayed diagnosis. Compliance rose from 47% to 86%, a 74% reduction in missed follow-ups. In 2017, Wandtke and a colleague published their results in the American Journal of Roentgenology.
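As a quick sanity check of the arithmetic (assuming the 74% figure refers to the relative drop in missed follow-ups rather than the raw difference in compliance points):

```python
# Compliance rose from 47% to 86%, so missed follow-ups fell from 53% to 14%.
before_missed = 1 - 0.47  # 0.53
after_missed = 1 - 0.86   # 0.14
reduction = (before_missed - after_missed) / before_missed
print(round(reduction * 100))  # -> 74
```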

Extending the Model
The University of Rochester decided to scale the Backstop program to its entire health system in July 2017, increasing the number of patients tracked tenfold. The program went from one hospital to six and from five radiologists to 75. Wandtke says the first three to six months were challenging because his team was attempting to educate all of the radiologists about the process while continuing to track recommendations that radiologists manually identified and flagged as they dictated cases.

Initially, only about one-quarter of the recommendations expected based on pilot-project rates were being entered into the system. Because the internally developed tools couldn’t handle the larger volume of recommendations, Wandtke decided to partner with an IT vendor. He applied the natural language processing capability of Nuance’s mPower clinical analytics solution to automatically identify recommendations in unstructured reports, which increased the number of recommendations entered into the tracking database by a factor of 2.5. Since March 2018, the system has gone from tracking 150 patients per month to more than 500, raising compliance to more than 80% of the goal.
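To illustrate the kind of extraction involved, here is a deliberately naive regex sketch that pulls a follow-up modality and interval out of unstructured report text. Nuance’s mPower uses far more sophisticated natural language processing; nothing below reflects its actual implementation, and the pattern, modality list, and sample report are all invented for the example.

```python
import re

# Naive pattern: "recommend"/"suggest" ... modality ... "in/within N units".
PATTERN = re.compile(
    r"(?:recommend|suggest)\w*\s+(?:a\s+)?(?:follow[- ]?up\s+)?"
    r"(?P<modality>CT|MRI|ultrasound|PET)\b"
    r".*?(?:in|within)\s+(?P<num>\d+)\s+(?P<unit>day|week|month|year)s?",
    re.IGNORECASE,
)

def extract_followup(report_text: str):
    """Return (modality, interval) if a follow-up recommendation is found."""
    m = PATTERN.search(report_text)
    if not m:
        return None
    return m.group("modality"), f"{m.group('num')} {m.group('unit')}(s)"

impression = "6 mm pulmonary nodule. Recommend follow-up CT in 6 months."
print(extract_followup(impression))  # ('CT', '6 month(s)')
```

A regex like this would miss most real-world phrasings, which is precisely why a clinical NLP engine, rather than hand-written rules, is needed at health-system scale.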

“The ability to automate the follow-up process not only reduces time and effort but also improves the accuracy of the system,” says William Boonn, MD, chief medical information officer at Nuance Communications, and a practicing cardiovascular radiologist at the University of Pennsylvania in Philadelphia. “When you rely on radiologists—who are already busy and overworked and have a ton of work to do—to put in different macros to be able to invoke a follow-up, there will be times when things fall through the cracks.”

Natural language processing also allows notifications to be generated from the database with a few clicks from the radiologist, while the case is being dictated. Other tools automate, track, and manage follow-up recommendations. One clinical navigator with a medical and administrative background is able to manage and track approximately 6,000 recommendations per year.

“We have all of this data, but it hasn’t been organized in a way that it becomes actionable. What Dr. Wandtke’s been able to demonstrate is that we can have the applications understand the follow-up in the unstructured text of the report and put it into a task list that is actionable so a navigator can now see whether a patient actually did have follow-up or that the patient is no longer in the health system,” says Karen Holzberger, senior vice president and general manager of diagnostic services for Nuance. “Right now, we’re throwing nurses and other coordinators at this problem when we have the technology and all the work that’s being done around AI on the processing side to be able to augment their work so they can be more efficient and more accurate.”

Wandtke says the quality of recommendations being made has also steadily increased since Backstop was implemented. The proportion of patients for whom the system cannot identify appropriate follow-up care has decreased from 45% to 9% as a result of the tracking system, reducing the number of patients at risk for delayed diagnosis by 80%. And, he notes, early intervention not only improves outcomes; it also saves money in the long run.

“We look at recommendation tracking like a screening program, in the sense that recommendations are used primarily for patients who may have an early-stage cancer. By following these patients more closely, we can identify more treatable cancers and give our patients their best chance for a cure,” Wandtke says. “The value of our radiology report, if our recommendation is not followed, is often zero. By spending some resources to follow your recommendations, you can help ensure that your radiology reports have value. And by bringing back more patients for their follow-up care, this is one of those rare quality improvement projects that actually pays for itself.”

— Dave Yeager is the editor of Radiology Today.