April 30, 2007
CPOE Check-up
By Elizabeth S. Roop
For The Record
Vol. 19 No. 9 P. 16
Nearly ready to make its national debut, a self-assessment tool shows healthcare organizations where their CPOE and EHR systems need work.
It’s been five years since development started on The Leapfrog Group’s Evaluation Tool for Computerized Physician Order Entry (CPOE)/Electronic Health Record (EHR). Finally, funding is the only obstacle to making the Web-based self-assessment tool available for widespread use in hospitals and ambulatory care settings.
“The tool is ready to go; we’re simply trying to identify the mechanism for bringing it to the marketplace,” says Leapfrog CEO Suzanne Delbanco.
Developed for Leapfrog by First Consulting Group (FCG) with funding from the California HealthCare Foundation, the Robert Wood Johnson Foundation, and the Agency for Healthcare Research and Quality, the testing tool is designed:
• for public reporting, to provide a measure of how far along a hospital or physician practice is in using CPOE or the ambulatory EHR to improve medication safety and, in the case of the ambulatory EHR, quality; and
• for the hospital or physician practice, to provide specific feedback about the extent to which the implemented decision support is intercepting medication errors that could harm patients.
“This was a way to do two things: get an objective measure of how far along in the journey of using these tools a hospital’s actual implemented software is but also provide some feedback to the organization so they can continue their ongoing efforts to use the tools in the interest of medication safety,” says FCG research director Jane Metzger, adding that the objective measure is for public reporting, while feedback provides input for ongoing improvement.
“It is a continuum,” she says. “If we are going to get to a state where there are fewer medication errors, then this is an interim look. Are we using the IT resources we have to help us to the extent that we could be using them?”
The evaluation tool and testing standards complement other HIT initiatives, including the Certification Commission for Healthcare Information Technology (CCHIT) certification of vendor products, pay for performance, and the National Quality Forum’s postimplementation survey of hospital safe practices.
The latter includes several aspects of the EHR, including CPOE, and is now directly linked to Leapfrog’s CPOE standard.
How It Works
The evaluation is a self-administered test managed by a Web application. It features separate tests for pediatric and adult orders in inpatient and outpatient settings, representing 10 categories of potentially dangerous errors.
Those categories, developed by FCG and an advisory group that included the Institute for Safe Medication Practices, are as follows (a schematic sketch appears below):
1. Therapeutic duplication: medications that therapeutically overlap with another new or active order, whether as the same drug, a drug in the same class, or a component of a combination product;
2. Single and cumulative dose limits: medications with a specified dose that exceeds recommended dose ranges or will result in a cumulative dose that exceeds recommended ranges;
3. Allergies and cross-allergies: medications for which a patient allergy or an allergy to other drugs in the same category has been documented;
4. Contraindicated route of administration: orders specifying a route of administration not appropriate for the identified medication;
5. Drug-drug and drug-food interactions: medications that result in known dangerous interactions when administered in combination with a different medication in a new or existing order or result in interactions in combination with a food or food group;
6. Contraindication/dose limits based on patient diagnosis: medications either contraindicated based on the patient’s diagnosis or for which the diagnosis affects appropriate dosing;
7. Contraindication/dose limits based on patient age and weight: medications either contraindicated for the patient based on age and weight or for which age and weight must be considered in appropriate dosing;
8. Contraindication/dose limits based on laboratory studies: medications either contraindicated for the patient based on laboratory studies or for which relevant laboratory results must be considered in appropriate dosing;
9. Contraindication/dose limits based on radiology studies: medications contraindicated for the patient based on interactions with a contrast medium in a recent or ordered radiology study; and
10. Corollary: an intervention that requires an associated or secondary order to meet the standard of care.
The system also looks at nuisance alerting and the use of decision support to reduce duplicate laboratory testing, and the ambulatory care tool tests basic health maintenance prompts as well.
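Leapfrog has not published the tool’s internals, but the test matrix described above (adult versus pediatric orders, inpatient versus outpatient settings, and the 10 error categories) lends itself to a simple data model. The Python sketch below is purely illustrative; every name in it is an assumption made for explanation, not part of the actual tool.

    from dataclasses import dataclass
    from enum import Enum, auto

    class ErrorCategory(Enum):
        # The 10 categories of potentially dangerous errors listed above
        THERAPEUTIC_DUPLICATION = auto()
        DOSE_LIMITS = auto()          # single and cumulative dose limits
        ALLERGY = auto()              # allergies and cross-allergies
        ROUTE = auto()                # contraindicated route of administration
        DRUG_INTERACTION = auto()     # drug-drug and drug-food interactions
        DIAGNOSIS = auto()            # contraindication/dose limits by diagnosis
        AGE_WEIGHT = auto()           # contraindication/dose limits by age and weight
        LAB_STUDIES = auto()          # contraindication/dose limits by laboratory studies
        RADIOLOGY = auto()            # contrast-medium interactions
        COROLLARY = auto()            # missing associated or secondary order

    @dataclass
    class TestOrder:
        # One scripted order entered against a downloaded test patient
        patient_id: str            # identifies the test patient loaded into the system
        category: ErrorCategory    # the error this order is designed to provoke
        pediatric: bool            # separate tests cover pediatric and adult orders
        inpatient: bool            # and inpatient versus outpatient settings
        intercepted: bool = False  # did the implemented decision support flag it?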
The process begins when the facility registers and downloads instructions and a set of test patients to enter into its system. When it’s time to perform the assessment, a set of 30 to 40 test orders is downloaded and entered. The facility then self-reports the results on the Web site to generate a score. The aggregate score is sent to Leapfrog, and the facility receives a more detailed report on its scores by order category. A system that fulfills the Leapfrog standard will intercept at least 50% of common serious prescribing errors.
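The article gives only the headline arithmetic: a system fulfills the standard if it intercepts at least 50% of the scripted errors, and the facility also receives a breakdown by order category. Continuing the hypothetical sketch above (it assumes the TestOrder records just defined), the scoring might look like the following; this is a plausible reading of the article, not Leapfrog’s actual algorithm.

    from collections import defaultdict

    LEAPFROG_THRESHOLD = 0.50  # the standard: intercept at least 50% of test errors

    def score_results(orders):
        # Returns the aggregate interception rate (the figure sent to Leapfrog)
        # and the per-category rates (the detailed report kept by the facility).
        hits, totals = defaultdict(int), defaultdict(int)
        for order in orders:
            totals[order.category] += 1
            hits[order.category] += int(order.intercepted)
        by_category = {cat: hits[cat] / totals[cat] for cat in totals}
        aggregate = sum(hits.values()) / len(orders)
        return aggregate, by_category

    # Usage: aggregate, detail = score_results(entered_orders)
    #        meets_standard = aggregate >= LEAPFROG_THRESHOLD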
“This allows them to evaluate their systems in a way that is unprecedented,” says Delbanco. “If a hospital has a system, and they use the evaluation tool and find out they’re not intercepting enough errors because the physicians have asked to have a lot of the alerts turned off, it’s a way of going back to the physician and saying, ‘We need those alerts if we want to protect the patient.’”
The vendor-agnostic tool is intended to test physician usage of the implemented software rather than the software itself. CPOE and EHR systems “are more of a toolbox than a ‘take it out of the box and plug it in’ [product]; there are a lot of decisions, set-up, and testing involved. Just because someone is using CPOE doesn’t mean they’re using any of its tools,” says Metzger. “What the test does is let them gauge at a certain point in time which problems the tools are addressing.”
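Metzger’s “toolbox” point can be pictured as site-level configuration: each safety check ships with the product but must be switched on, tuned, and tested locally. The fragment below is hypothetical and vendor-neutral; it only illustrates how a hospital can “use CPOE” while much of its decision support sits idle, which is precisely the gap the evaluation tool is designed to surface.

    # Hypothetical site configuration for an implemented CPOE system. Nothing
    # here is any vendor's real format; it simply shows that each check is a
    # separate decision made (and sometimes reversed) during setup.
    SITE_DECISION_SUPPORT = {
        "drug_drug_interaction": True,
        "drug_allergy": True,
        "single_dose_limits": False,       # disabled after physician complaints
        "therapeutic_duplication": False,  # never configured at go-live
        "corollary_orders": False,
    }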
Unique Challenges
Taking the evaluation tool from concept to reality meant overcoming several challenges, starting with the fact that only approximately 2% to 3% of hospitals had fully implemented CPOE systems at the time development began.
“It was an idea before its time in many ways,” says Delbanco. “When we started working with the idea, there were only a handful of hospitals in the country that had CPOE systems.”
However, there were enough to proceed, which brought the next challenge: Because the tool was meant to be vendor-agnostic, it needed to work with any vendor or legacy product. It also needed to address the lack of standardization in how physicians write certain types of orders, particularly complex ones, says Metzger.
“That was one of the objectives of extensive reliability and validity testing—figuring out which orders were not testable and removing them,” she says. “Another [problem] was that both hospitals and physician practices adopt a particular formulary when they set up their systems, and we had to be sure that we had medications that were universally included in formularies.”
They also needed to ensure the actual testing process was not so onerous that hospitals and physician practices would shy away from dedicating the resources to use it.
In the end, through extensive validity and reliability testing and pilot runs at a handful of hospitals and practices with implemented systems, the right balance was found.
Although most sites tap IT staff to help set up the test patient records, the entire process can be completed within a couple of days, says Eric Pifer, MD, chief medical informatics officer for the University of Pennsylvania Health System, which served as a pilot site.
The health system implemented its CPOE system in early 2004 and ran the evaluation tool against it in mid-2005. Among the problem areas the tool identified were weaknesses in intercepting drug-drug and drug-allergy interactions.
“It did a good job,” says Pifer. “Most of the things that the test told me I knew, but the degree to which we failed on certain things on the test provided a good impetus to change some things internally and get some resources to do that.
“I had been telling [the administration] for some time prior to using the Leapfrog tool that we had problems with our decision support systems, so they weren’t terribly surprised,” he adds. “There were a few things that were a surprise. For example, we thought we’d get perfect scores on drug interactions and allergy interactions, but we didn’t. It highlighted the fact that even things we thought we were doing well could potentially use some work.”
Pifer says work is underway to address the areas of the health system’s CPOE that the evaluation identified as needing improvement, which is ultimately the tool’s purpose.
“Sites that have actually gone through the process have indicated that they learned things they didn’t know,” says Metzger. “They are working on [their systems] continuously, and this gives them a way to look at where they are and where they may want to focus their efforts next.”
Now, with pilot testing complete, the only remaining challenge is raising the funds necessary for widespread usage, including ongoing support and enhancements.
“The tool is done, but we’re going through the process of…trying to raise some grant funds for its initial launch,” says Delbanco. “Even though the tool is done, it requires continuous maintenance. Also, it’s based on the idea of having hospitals simulate making orders for fictitious patients, and then reporting back how their system does. We need to have new scenarios generated on an ongoing basis so providers can’t ‘game’ the system, and so we can continue to test a broad range of features.”
Setting the Standard
Ultimately, the assessment results will be incorporated into the publicly available results of The Leapfrog Group’s annual hospital survey, which already includes questions about CPOE. The group is also planning to work with other organizations to integrate the ambulatory standards and assessment into physician rating systems.
When that happens, Delbanco says, it will facilitate accurate public reporting by hospitals that want to share that they have CPOE in place as a mechanism to protect patients from medication errors.
The tool will also help foster a better understanding of the difference between order entry and decision support and provide a method for hospitals and physicians to independently evaluate how well the CPOE and EHR systems they’ve invested in are achieving the expected goals.
“Hospitals, and particularly CEOs and financial types in hospitals, don’t yet understand the difference between a well-implemented order entry system and a well-implemented decision support system,” says Pifer. “They feel that if they signed a check for CPOE, they have decision support… What this does is highlight the fact that they are different things, and that engineering the decision support is a separate task that needs to be resourced and focused on appropriately.”
The hope is that the evaluation tool will be used periodically to gauge progress and guide further improvement, although there will be restrictions in place on how often the test can be taken for public reporting.
“This is a small part of the puzzle that Leapfrog has chosen to address,” says Metzger. “But certainly the call of the [Institute of Medicine] and others for bringing information technology into clinical care is based on an expectation that it can provide a significant safety net that isn’t possible without it. This is just a way of looking at progress toward using that safety net effectively.”
— Elizabeth S. Roop is a Tampa, Fla.-based freelance writer specializing in healthcare and HIT.