May 9, 2011
How to Measure CDI Effectiveness
By Annie Macios
For The Record
Vol. 23 No. 9 P. 14
There are ways to tell whether a clinical documentation improvement program is fulfilling its intended purpose.
Clinical documentation improvement (CDI) programs are important to any facility that recognizes the necessity of complete and accurate patient documentation. While each facility’s methodology of implementing a CDI program can vary, certain universal factors contribute to its success, including strong physician buy-in, communication of key data findings, and ongoing review and improvement of the initiative’s purpose and objectives.
Whether you are just starting or have had a program for years, measuring its benefits and building strategies to stay on track will help keep it robust and effective.
Steve Robinson, senior director of CDI solutions at Maxim Health Information Services, says key CDI team members include case management and HIM directors and compliance department representatives—generally whoever is going to take ownership of the project. Physicians should also be directly involved, he says, adding that it takes a group effort to reach maximum effectiveness.
The CDI program at Newton Medical Center (NMC) in Kansas began four years ago with the birth of a clinical documentation excellence task force whose goal was to improve a process that had become too inefficient.
“It [the program] started because physicians would go to administration frustrated because they were being queried by coding postdischarge, which created a problem because it was difficult at that point to remember the patient’s case,” says HIM Director Betty Lanzrath, MA, RHIA. “So with that, we moved to concurrent coding. Now they are able to issue coding queries that are more meaningful to physicians.”
NMC’s drive to improve clinical documentation was also a proactive response to the advent of Medicare severity diagnosis-related groups (MS-DRGs). “The program was originally started for financial reasons. They knew that MS-DRGs were happening and if the documentation didn’t have specificity, we would lose money,” says Lanzrath.
Physician participation is usually a strong indicator of a CDI program’s success. At NMC, participation was gradual. “Each physician is an individual. At first they were not all on board with it, so we began the program with those who were and gradually increased it to include 100% of the physicians,” Lanzrath says.
No doubt physician buy-in is a huge barometer of a program’s success, says Dexter D’Costa, MBBS, MHA, manager of the clinical documentation program at Stanford Hospital & Clinics.
“To tell if a program is effective, you look within the program at the CDI workflow, physician buy-in, query response rates, and query agreement rates. Also look at the level of involvement of physicians in understanding the process of CDI and whether queries are appropriate. The level of involvement tells if it is successful,” he explains, adding that it’s also important for clinicians to work with HIM on DRG reconciliation and to ensure that coding reflects the clinical picture.
It’s equally important for healthcare organizations to come up with their own definition of “effective,” according to D’Costa. “For some facilities, it is revenue. For others, including us, it is quality and risk of mortality. For others, it is a hybrid of case management, quality, and coding,” he says. “There is no right answer, but if you set goals for what is the purpose of your existence and execute them, then you will be successful.”
Chealsea Nather, MS, Stanford’s director of clinical analytics, looks for balance in the facility’s CDI program by examining the queries being performed, checking for areas where additional education is needed, and working on solutions to enable improvement.
At St. Luke’s Hospital in Duluth, Minn., where the CDI program has been in effect for 10 years, physician acceptance and excellent communication between CDI staff and physicians have combined to help it flourish. “When physicians seek me out for documentation to ensure they capture as much as they can, it shows the physician is involved in the process,” says Heidi Hillstrom, RN, MS, MBA, CCDS, a member of St. Luke’s clinical documentation management team.
In his role as a consultant, Robinson recommends regular reviews of a CDI program. He suggests five on-site follow-up visits be conducted during the first year, followed by one every six months. However, he notes the norm is that only one or two reviews are performed after the first year.
“I recommend facilities put a CDI review in line with the coding audit. If they are doing a coding audit every six months, why not look at the CDI program as well? The CDI program is the feeder of correct information to the coder,” he says.
Measure of Success
Each facility measures different variables based on its goals and objectives to ensure its CDI program is effective.
The elements used to gauge success at NMC include the total number of charts reviewed concurrently, the number of cases that should have been reviewed vs. the number actually reviewed, the number of queries issued, the number of queries answered within 30 days, and case mix index (CMI) by physician, by physician specialty, and hospitalwide.
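By way of illustration, these measures reduce to simple arithmetic over case records. The sketch below is hypothetical — the field names and figures are invented for the example, not drawn from NMC’s actual system — and computes the review rate, query counts, and CMI, where CMI is taken as the average DRG relative weight across discharges.

```python
# Illustrative sketch of CDI metrics; all record fields and values
# are hypothetical, not taken from any actual CDI software.
from datetime import timedelta

cases = [
    # Each dict stands in for one inpatient case.
    {"physician": "Dr. A", "reviewed": True, "queried": True,
     "query_answered_in": timedelta(days=12), "drg_weight": 1.45},
    {"physician": "Dr. B", "reviewed": True, "queried": False,
     "query_answered_in": None, "drg_weight": 0.92},
    {"physician": "Dr. A", "reviewed": False, "queried": False,
     "query_answered_in": None, "drg_weight": 1.10},
]

total_eligible = len(cases)                   # cases that should have been reviewed
reviewed = sum(c["reviewed"] for c in cases)  # charts actually reviewed concurrently
queries_issued = sum(c["queried"] for c in cases)
answered_30 = sum(
    1 for c in cases
    if c["query_answered_in"] is not None
    and c["query_answered_in"] <= timedelta(days=30)
)

# Case mix index: average DRG relative weight across discharges.
cmi_hospitalwide = sum(c["drg_weight"] for c in cases) / total_eligible

print(f"Review rate: {reviewed / total_eligible:.0%}")
print(f"Queries issued: {queries_issued}, answered within 30 days: {answered_30}")
print(f"Hospitalwide CMI: {cmi_hospitalwide:.2f}")
```

The same tallies can be grouped by physician or specialty to produce the per-physician CMI comparisons Lanzrath describes.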
“We are a small hospital; we have two hospitalists on board who compete to get those numbers up. We need to have the documentation to show physicians what difference documentation makes to CMI, with the data pointing to more specific documentation needed,” Lanzrath says.
At NMC, the recovery audit contractor coordinator compiles CDI information in a database for distribution to a task force consisting of the two hospitalists, a vice president, coding staff, and case management personnel.
NMC keeps physicians aware of the importance of detailed documentation through a variety of strategies, including asking specialists to lead seminars in which they provide colleagues with more details on their specialty. Also, the recovery audit contractor coordinator produces a popular monthly newsletter featuring helpful hints and ideas on how to produce better documentation.
“We used a lot of tools to help physicians learn how to document better,” Lanzrath notes. “We developed 25 coding query templates for specific diseases. Physicians didn’t want handwritten queries, so we also developed a process for queries to meet their needs.”
Lanzrath says it’s challenging to use paper queries while NMC as a whole becomes more reliant on its EMR. “I want to find out how to move to an electronic query, which we haven’t done yet because of the limitation of our current software,” she says, adding that NMC is focused on developing a documentation tool featuring disease-specific templates that prompt physicians to electronically enter progress and discharge notes while using the newly implemented physician order entry system.
D’Costa and his team at Stanford Hospital & Clinics are responsible for examining query trends and patterns: to whom queries are sent, what they contain, and correlated outcomes such as severity of illness and risk of mortality. Throughout this process, he collaborates regularly with HIM. The team reports the data to senior leadership and medical staff.
“The level of EMR you have—we are a stage 7 HIMSS adoption model—also enhances your ability to pull data,” D’Costa says. However, he notes that even if an organization does not have an EMR, it can still reap the benefits of a CDI program by being able to present physicians with data that demonstrates precisely where their documentation is not up to standards. This clarity can go a long way toward gaining physician buy-in, he notes.
Hillstrom believes the people running the CDI program are best equipped to interpret the data because they have the clearest understanding of how the requested information is entered. At St. Luke’s, CMI, risk of mortality, and revenue generated from capturing complications and comorbidities (including major ones) are used to determine the program’s effectiveness. The impact of DRGs in supporting length of stay and utilization of services is also examined. The report is then given to administration for review.
St. Luke’s uses an Excel program that includes all the data needed to measure success. “It begins with the query and continues from the patient to the record to the physician to the coding department. The coding staff reviews the medical records postdischarge and consults the clinician if further clarification is needed,” Hillstrom says. “We have a very close relationship with coding staff; we couldn’t make this work without a strong relationship with them. We are talking day in, day out.”
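A tracking log like the workbook Hillstrom describes might be modeled as one row per query, following the case from patient to record to physician to coding. The snippet below is only a rough sketch of that idea; every column name and value is hypothetical, meant to illustrate the query-to-coding flow rather than St. Luke’s actual spreadsheet.

```python
# Hypothetical layout for a CDI query-tracking log; column names
# and values are illustrative only.
import csv

FIELDS = [
    "query_date", "patient_id", "record_number", "physician",
    "query_topic", "physician_response", "response_date",
    "initial_drg", "final_drg", "coder_reconciled",
]

rows = [
    {"query_date": "2011-04-02", "patient_id": "0001", "record_number": "MR-77",
     "physician": "Dr. A", "query_topic": "heart failure specificity",
     "physician_response": "acute systolic CHF", "response_date": "2011-04-03",
     "initial_drg": "293", "final_drg": "291", "coder_reconciled": "yes"},
]

# Write the log so both CDI and coding staff can review the same rows.
with open("cdi_query_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```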
In addition, monthly meetings are scheduled to address ongoing concerns.
What Doesn’t Work
Lack of physician buy-in tops the list of reasons a CDI program would fail to produce results, according to Lanzrath. To help avoid this problem, NMC holds weekly meetings with leadership at the vice president level who support the program.
D’Costa says there are many factors that could render a CDI program ineffective, including a failure to create specific goals, unclear staff expectations, poorly defined process/outcome measures, and a lack of communication not only between physicians and staff but also between staff and other departments throughout the facility.
“You may have great data, but if you don’t communicate it effectively, it isn’t helpful,” he says.
Nather stresses the need for institutional support to run a CDI program effectively. “I believe the landscape would look differently here [at Stanford] if the senior leadership didn’t support and see the value of the program,” she says.
To readjust a floundering program, D’Costa believes an organization first needs to be aware of the problem, which is where measuring success comes into play. He also recommends being flexible in finding an alternate solution.
“Validate the problem and that the current process isn’t working and be open to change, knowing there is more than one way to do things if your original plan isn’t quite working,” D’Costa says.
Maintaining a high profile for the program as it evolves is also important. “How many resources do you have as champions for change to make sure the program is sustainable? It is important after two to three years to make sure they don’t lose sight of the program,” D’Costa says.
Hillstrom says a program can go awry if the CDI specialist lacks passion or if buy-in is not obtained from coders, physicians, and administration. “At our facility, to ensure physician buy-in, every new physician goes through orientation on clinical documentation. All residents are met with at the beginning of their residency to establish a relationship. We are part of the orientation curriculum,” she says.
In addition, St. Luke’s CDI staff keeps abreast of the latest recommendations on how best to document a variety of diagnoses. “We look at, learn, talk to the doctors about new recommendations. We talk to the doctors and create our own tip sheets regarding new developments to query doctors on a particular case. We also talk to specialty groups and review new recommendations based on research and current literature,” says Hillstrom, who credits much of St. Luke’s success to physicians who help with not only terminology but also with describing how a diagnosis works physically.
She says the trust shared among physicians, coders, CDI staff, and administration is strong enough that when the CDI program needs to be changed or readjusted to ensure it stays on track, problems can be addressed head on in a cooperative fashion.
“Healthcare is dynamic, so if we are not keeping up with the latest information, it won’t work. Our program is 10 years old, not as big as others, but our administration really supports us completely,” Hillstrom says.
According to Robinson, insufficient staff numbers may be the biggest reason CDI programs fail to garner the desired results. He also stresses the importance of determining whether physician queries are meeting their objectives.
The value of a successful CDI program is only going to increase, Robinson says.
“I think CDI is going to become more and more important in light of ICD-10; accurate billing will be even more dependent on the ability to code correctly, which will be more dependent on accurate documentation to code from,” he says.
— Annie Macios is a freelance writer based in Calgary, Alberta, Canada.