September 2018
Assessing Documentation Quality — Making Sense of Best Practice Standards for Report Creation
By Selena Chavis
For The Record
Vol. 30 No. 8 P. 16
In health care, there are many facets to a quality documentation product. There are also many players, each with a significant role.
The last decade has ushered in multiple new models for the creation of health care documentation. Traditional dictation and transcription methods that have dominated the field for years are now accompanied by an increased focus on clinician-created documentation as well as use of front-end and back-end speech recognition supported by editing from health care documentation specialists.
In tandem with these movements, AHIMA and the Association for Healthcare Documentation Integrity (AHDI) recognized the need for a unified set of quality assessment standards that addresses errors across all documentation models. To that end, the organizations combined efforts in 2017 to publish the "Healthcare Documentation Quality Assessment and Management Best Practices" (QA Best Practices), merging AHDI's 2010–2011 best practice document with the 2014 clinician-created documentation quality assurance program.
"The QA Best Practices provides a comprehensive approach to quality measurement and improvement. Patient safety is at extreme risk without quality documentation of each and every encounter," says Susan Lucci, RHIA, CHPS, CHDS, AHDI-F, senior privacy and security consultant with tw-Security, who notes that patient safety is at the forefront of all major health care initiatives and providing guidance on improving the accuracy of health records is one of the most important steps a provider can take. "When documenting any clinical encounter, quality must be interpreted and measured objectively and consistently if future improvement in documentation is to be achieved," she says.
Dale Kivi, MBA, vice president of business development with FutureNet, points out that the QA Best Practices toolkit specifically includes the need to address quality issues with documents generated solely by physicians, whether through direct typing, front-end speech recognition, or drop-down menus. The stakes are high, he says, noting that a 2018 study published in the Journal of the American Medical Association found error rates of 7.4% or higher with documents generated using front-end speech recognition.
"The momentum is building to take a closer look at how quality is measured and why there should not be a different measure based on the technology used to generate the reports," Kivi says, adding that clarification has been needed regarding error types and loopholes that he believes are being exploited by some industry vendors. "Specifically, inserted and omitted text are the most common failures associated with front- or back-end speech recognition technology, and they were being ignored under major vendor quality assessment methods."
For M*Modal, Jason Kolinoski, senior vice president of transcription solutions, says the answer is marrying the standards of quality performance with the company's own quality check methodologies to support "accurate, consistent, reliable, and timely clinical documentation."
"It is equally important that the technology platform also be able to measure, monitor, and consistently drive improvements in quality so that clients can verify that these standards are being met and exceeded daily," he explains. "Our clients are not only committed to meeting these industry standards but consistently seek to surpass them."
Minimizing Industry Variation
Patt King, manager of HIM-Medical Transcription for TMC Healthcare and the director of District 1 for AHDI's National Leadership Board, says variances in how AHDI's best practices are followed for quality assurance present significant challenges to health care organizations. "Many vendors, health care organizations, and facilities have their own standards for quality and, in particular, their own standards for the measurement of that quality when auditing the quality of their workforce," she says.
In the updated QA Best Practices toolkit, AHDI allows for some variation by offering two methods of measurement. Additionally, King says that some organizations prefer to classify certain minor errors as educational, with no point values, while other vendors or facilities may assign those same errors a point value. These differing approaches exacerbate discrepancies across the industry.
In terms of quality assurance, documentation standards focus on various types of errors, the most important being critical errors. An updated definition for critical errors was released with the toolkit to better align with current health care industry movements.
King says errors can be deemed critical if they do any of the following:
• adversely impact patient safety;
• alter the patient's care or treatment;
• adversely impact coding and billing accuracy;
• result in a HIPAA violation; and
• adversely affect medicolegal outcomes.
"There are many ways health care documentation can be accomplished in the current health care environment, including by traditional dictation and transcription, back-end or front-end speech recognition, typed by the clinicians themselves, or created by a scribe working closely with the clinician," King notes. "In general, however, a critical error is a critical error regardless of how it is made or who makes it."
In-house teams generally follow the rules, Kivi says. However, he believes there is great variation in how vendors follow the standards. "Those who directly offer speech recognition products are known to cut corners in order to deliver quality assessments that suggest compliance with scores of 98 or above, but they ignore inserted and omitted text and apply assessment methodologies that have nothing to do with the recommended AHDI/AHIMA document evaluation process," he says. "When their documents are actually evaluated with the AHDI/AHIMA standards, their scores drop dramatically."
For example, Kivi points to vendor-reported quality scores of 99.2 and 99.7, which are impossible to attain under the QA Best Practices' scoring method of subtracting whole-number error values from 100.
King agrees, noting that "if a single report was evaluated using the error value from 100 method used by Company A and then also evaluated based on taking the total errors divided by number of lines audited method used by Company B, the final scores might be different." She suggests quality assurance professionals review the sample QA score sheets provided by the QA Best Practices Supplementary Toolkit Resources for clarification.
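To illustrate how the two measurement approaches King describes can diverge, the following sketch scores the same hypothetical report both ways. The error point values and line count are assumed for demonstration only, and the exact formula behind an errors-per-lines method varies by organization; the takeaway is that subtracting whole-number error values from 100 can produce only whole-number scores, while a per-line calculation can yield fractional ones.

    # Minimal sketch of the two scoring approaches described above.
    # The error point values and audited line count are hypothetical,
    # chosen only to show how the same report can score differently
    # depending on which method is applied.

    def score_subtract_from_100(error_point_values):
        # Method A: subtract whole-number error values from a starting score of 100.
        return 100 - sum(error_point_values)

    def score_errors_per_lines(num_errors, lines_audited):
        # Method B (one common form): percentage of audited lines that are error-free.
        # Exact formulas vary by organization.
        return 100 * (1 - num_errors / lines_audited)

    # Hypothetical single-report audit: two minor errors worth 1 point each
    # and one critical error worth 3 points, found across 65 audited lines.
    error_point_values = [1, 1, 3]
    lines_audited = 65

    print(score_subtract_from_100(error_point_values))   # 95 (whole numbers only)
    print(round(score_errors_per_lines(len(error_point_values), lines_audited), 1))   # 95.4

Under these assumptions, a fractional score such as 99.2 could arise only from a per-line or similarly weighted calculation, not from subtracting whole-number error values from 100.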
While vendor best practices for quality may vary, Kolinoski says M*Modal has taken the unique approach of building its solutions on a single, EHR-integrated, cloud-based clinical documentation platform, thus setting the stage for a high-quality medical record. "A single, cloud-based user speech profile enables doctors to use any workflow from any care setting and any device: real-time front-end speech recognition directly in the EHR, traditional dictation/transcription workflow, partial dictation workflow, or mobile documentation," he explains, adding that care teams can take this a step further by using functionality to get proactive nudges on quality. "Because it's all on a single platform, we are able to maintain the highest level of quality, security, documentation integrity, and stability across all our documentation solutions."
Kristin Wall, CHDS, AHDI-F, senior programs coordinator and editor with AHDI, says the QA Best Practices "promotes transparency of all policies and processes—including error categories and how scoring is done—between an organization's quality assurance staff, health care documentation specialists, and clients, respectively, to help avoid ambiguity and misunderstanding."
Best Practices for Applying the Standards
Tammy Combs, RN, MSN, CDIP, CCS, CCDS, director and lead nurse planner of HIM practice excellence at AHIMA, emphasizes the importance of secondary audits and assessments for ensuring the highest level of quality. That starts with a clear understanding of expectations between health care organizations and vendors along with ongoing validation of quality assessment methods.
"Let's say that a vendor tells their transcriptionists to follow the best practices laid out in this toolkit. An organization contracting with this vendor would want to validate that," she says, pointing to the need to understand the sample used as well as the processes and procedures for completing a quality audit. "A lot of organizations utilize vendors, and vendors are wonderful. But it's always important for organizations to think about doing their own internal audits as well to be sure that what they are receiving is meeting that high quality standard."
Jill Devrick, MPA, AHDI-F, product solutions advisor with 3M Health Information Systems, offers similar advice, suggesting that HIM directors do their homework on the front end to ensure they are on the same page with their vendor partners about acceptable standards. "I recently learned of a facility that was told by a large transcription and speech recognition vendor that 'some kind of' industry standard exists that says it is not necessary to edit health care documentation if the error is minor and does not change the meaning or context of the document," she says. "This claim is inaccurate, so I referred the facility to AHDI's QA Best Practices, which recommends always making appropriate edits to the document, no matter how small."
Devrick adds that although minor errors should be detected and corrected, AHDI does not necessarily recommend assigning point values to them during a quality review because they do not impact patient care. "Errors in spelling, punctuation, and formatting may not affect the comprehension of the documentation, but minor quality issues may call into question the credibility of an organization and its attention to detail in legal and compliance proceedings," she explains. "HIM directors should select vendors that share the same philosophy regarding documentation quality and agree to abide by the facility's standards."
Kolinoski says M*Modal works proactively with clients through its in-house Adoption Services team, made up of HIM experts who understand not only the technology but also optimal clinical systems and workflows. He asks that HIM directors regularly engage with the organization's team to provide early feedback to ensure goals are met. "Good and regular communication is the key, and HIM directors can ensure that vendors get up-to-date and complete organization specifics needed to populate reports," Kolinoski says. "From the vendor's side, it is crucial to engender trust, be super responsive to client needs, and continually adhere to best practices and quality standards."
Kivi says it's critical for HIM directors to require inclusion of the QA Best Practices in their contracts, with appropriate rate reduction or contract cancellation clauses for chronic noncompliance. "The beauty of the AHDI/AHIMA standards is that they apply equally for all document creation methods, whether it's traditional transcription, with or without the support of back-end speech, physician-centric documentation, or some combination of the two," he says.
Nevertheless, Kivi believes the industry is experiencing a loss of quality assessment efforts at the point of record creation under physician-centric documentation in the EHR. "Front-end speech and EHR vendors suggest quality is inherent to their technology and the physicians' direct participation in the record creation. It's like we all forgot everything we learned with Six Sigma and Lean, where quality assessments belong where errors are introduced," he explains, pointing out that coders, clinical documentation improvement specialists, and technology itself can be driving forces. "HIM needs to reassert themselves at the front end of the process with the use of the AHDI/AHIMA standards to ensure document integrity and quality."
Recently, some organizations have adopted clinician-created documentation integrity auditing—sometimes referred to as editing or analyzing—to address documentation created directly within the EHR. According to King, the process involves an expert health care documentation specialist evaluating a document without the clinician's voice recording. Instead, documentation specialists use their skills, experience, and the set of error criteria established by their organization to apply quality assurance standards to health care documentation that might otherwise go unreviewed.
The QA Best Practices toolkit provides everything needed for an organization to update or create its own quality assurance program, according to King. The toolkit can be applied to various platforms and methods of documentation creation. "Traditional dictation and both back-end and front-end speech recognition can be and generally are audited for quality by both vendor organizations and health care facilities, with the potential exception of clinician practices and smaller facilities with a limited workforce," she says.
— Selena Chavis is a Florida-based freelance journalist whose writing appears regularly in various trade and consumer publications, covering everything from corporate and managerial topics to health care and travel.