Making CPOE Work
By David Yeager
Vol. 13 No. 7 P. 6
Many in healthcare hope informatics will improve the efficiency of delivering medical care. For example, evidence-based criteria for ordering imaging exams may help clinicians choose more appropriate tests, which could lead to better patient care and save money. Ordering those tests electronically could reduce duplicate testing, reduce the number of errors related to patient demographics, and save paper.
However, because most medical institutions and facilities are in the early phases of adopting informatics tools such as these, there isn’t a large base of evidence to determine how well they work. A study that measured the adoption rate and meaningful use of computerized physician order entry (CPOE) and clinical decision support for medical imaging, which appeared in the February issue of the Journal of the American College of Radiology, provides some useful data.
The study, which was conducted at Brigham and Women’s Hospital (BWH) in Boston, covered the 10-year period from 2000 to 2010. Just over 4 million imaging studies were performed during the study period. Meaningful use of electronically created studies increased from 0.4% to 61.9%, and the rate of electronically signed studies rose from 0.4% to 92.2%. CPOE adoption went from 0.5% to 94.6%. The highest rates of electronically created studies were in the emergency department and inpatient settings. The lowest rate was in the surgery department.
Clinical decision support is embedded in a Web-enabled imaging CPOE system. It relies on a database of peer-reviewed studies on the utility of various imaging exams. The decision-support component calculates the pretest probability that a particular exam will be useful for making a diagnosis. If the probability falls outside of a prescribed range, other exams are considered.
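The threshold logic described above can be sketched in a few lines. This is a hypothetical illustration only: the function names, utility scores, and cutoff are invented for the example and do not reflect BWH's actual system or the real evidence base.

```python
# Hypothetical sketch of decision-support threshold logic.
# Scores and the cutoff are illustrative, not BWH's actual values.

# Evidence-derived utility scores for candidate exams, keyed by
# (indication, exam). In a real system these would be drawn from a
# database of peer-reviewed appropriateness studies.
UTILITY_SCORES = {
    ("minor head injury", "head CT"): 0.7,
    ("minor head injury", "skull x-ray"): 0.2,
    ("minor head injury", "head MRI"): 0.4,
}

LOW_UTILITY_CUTOFF = 0.5  # below this, suggest alternatives

def check_order(indication, exam):
    """Return (approved, alternatives) for a proposed imaging order."""
    score = UTILITY_SCORES.get((indication, exam), 0.0)
    if score >= LOW_UTILITY_CUTOFF:
        return True, []
    # The order falls outside the prescribed range: surface
    # higher-utility exams for the same indication instead.
    alternatives = sorted(
        e for (ind, e), s in UTILITY_SCORES.items()
        if ind == indication and s >= LOW_UTILITY_CUTOFF
    )
    return False, alternatives
```

In this sketch, ordering a head CT for a minor head injury passes, while ordering a skull x-ray is flagged and the higher-utility head CT is offered as an alternative.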
“There are certain imaging studies with specific indications that have been well studied—for example, CT of the head in minor head injuries—and there have been multiple reports on what types of symptoms and the settings in which that imaging study is needed,” says Ivan K. Ip, MD, MPH, a fellow at BWH’s Center for Evidence-Based Imaging and lead study author. “So those are the things that we targeted, decision support based on existing evidence to guide us.”
BWH developed the CPOE/decision-support system in-house over several years, beginning with pilot studies in 1998. In 2002, it began to phase in the system, first in outpatient areas and then inpatient areas and the emergency department. By 2007, the system was fully installed.
Since workflow concerns often determine a system’s success or failure, BWH has made it a priority to incorporate user feedback into the system. Features that improve usability, intuitiveness, and screen presentation were added, and minimizing the number of mouse clicks also proved to be important. Ip believes that the surgery department’s lower rate of adoption is related to its workflow being somewhat different from that of internists, and features have been added to address surgery’s usability issues. Additional features added at users’ requests include a mobile app that provides clinicians with reminders about signatures and remote access capability.
An interesting result of CPOE/decision-support adoption is that high-cost imaging utilization has dropped by approximately 12%; Ip says Massachusetts General Hospital, which has implemented a similar CPOE/decision-support system, has seen a comparable decrease. Along with the system, BWH included accountability tools that address ordering variability among providers, including quarterly practice variation reports and academic detailing of outlying results. Although the reduction in high-cost imaging was a welcome development, it was not BWH’s goal.
“Rather than just looking at reducing the number of studies, per se, we wanted to ensure that we improved the quality of how we performed imaging,” Ip says. “We also wanted to make sure that there was appropriateness so the right imaging studies were performed at the right time.”
It’s a little early to tell whether other facilities would achieve similar rates of adoption and efficiency, but Ip believes they could. The key, he says, is to focus on the end users’ experience. Even small community hospitals with limited resources could benefit from a CPOE/decision-support system because, once a system like this is in place, the maintenance costs are low.
In addition, as more facilities adopt similar systems, healthcare organizations of all sizes will benefit from the accumulated knowledge of those who have already taken the plunge. Ip believes the time to full implementation of CPOE/decision-support systems may eventually be reduced from five years to one to three years. He says healthcare facilities considering such a system should start by taking a comprehensive look at how it would be used throughout the organization.
“I would say get as many people involved as possible at the get-go,” Ip says. “[Include] people from different departments, people from different leadership groups, nurses, doctors, any end users—get them involved in the process as much as possible and get their feedback. Every institution’s workflow is different, and every provider’s workflow is different, so get him or her involved at the start. The implementation can be tailored to the institution. If things are not working and people are not using it, then make changes based on that.”
— David Yeager is a freelance writer and editor based in Royersford, Pennsylvania. He is a frequent contributor to Radiology Today.