The unprecedented level of detail in EMR clinical data opens new possibilities for defining clinical quality measures in more clinically meaningful ways. However, more detailed data can make it harder to ensure that data are comparable across institutions, according to an article in the March online edition of the Journal of the American Medical Informatics Association.
Michael G. Kahn, MD, department of pediatrics at the University of Colorado in Denver and Daksha Ranade from the department of clinical informatics at The Children’s Hospital in Aurora, Colo., sought to examine the impact of billing and clinical data extracted from an EMR system on the calculation of an adverse drug event (ADE) quality measure approved for use in The Joint Commission’s ORYX program, a mandatory national hospital quality reporting system.
The Child Health Corporation of America's (CHCA) “Use of Rescue Agents—ADE Trigger” quality measure, which formed the basis of The Joint Commission-approved measure, uses medication billing data from 48 nonprofit free-standing U.S. children’s hospitals contained in the Pediatric Health Information Systems (PHIS) data warehouse, according to the authors.
“Using a similar query, we calculated the quality measure using PHIS plus four data sources extracted from our EMR system,” the authors wrote. Four versions of the “Use of Rescue Agents—ADE Trigger” quality measure’s numerator and denominator events were developed as SQL-based queries against the Epic EMR system:
- Events defined using medication order codes;
- Events defined using medication charge codes;
- Events defined by intersecting medication orders with medication charges; and
- Events defined using medication administration codes.
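The four definitions can be thought of as set operations over discharge-level events, which is why the intersection-based version necessarily yields the smallest or equal denominator. A minimal sketch of that logic, using hypothetical discharge IDs and variable names (not the authors' actual SQL or data):

```python
# Hypothetical sets of discharge IDs flagged by each EMR-derived source.
# These stand in for the results of the four SQL queries; the IDs and
# names are illustrative only.
orders_placed = {101, 102, 103, 104, 105}   # rescue-agent medication orders
charges_billed = {102, 103, 104, 106}       # medication charge codes
administrations = {102, 103, 105}           # medication administration records

# The third definition intersects orders with charges, so its count can
# never exceed the smaller of the two source sets.
orders_and_charges = orders_placed & charges_billed

print(len(orders_placed))       # denominator via orders placed
print(len(charges_billed))      # denominator via charges
print(len(orders_and_charges))  # denominator via the intersection
print(len(administrations))     # denominator via administrations
```

Requiring an order *and* a matching charge filters out orders that were never billed and charges with no corresponding order, which mirrors the pattern in the reported counts, where the intersection-based denominator was smaller than either single-source denominator.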
In calendar year 2008, 15,662 children were discharged from The Children’s Hospital, according to the article. Of these discharges, 5,178 met the inclusion and denominator definitions using orders placed, 4,747 using medications charged, 4,150 using the intersection of orders and charges, and 4,116 using medication administrations.
“In attempting to understand the source of the observed differences, our analysis has shown that for this quality measure, differences in denominators, numerators and potential adverse event rates reflect a combination of common clinical practices and The Children’s Hospital-unique administrative, billing, documentation, data capture and workflow practice,” the authors wrote. “A difference based on The Children’s Hospital-unique workflows is more insidious to detect, more difficult to adjust and highly unlikely to be similar across institutions.”
“More detailed clinical information may result in quality measures that are not comparable across institutions due to institution-specific workflow and differences that are exposed using EMR-derived data,” the authors stated.
“Constant validation and re-validation of measures, using laborious multi-institutional manual chart reviews, must be done when a new source of clinical data becomes available. Detailed understanding of specific components of the EMRs, electronic documentation and institutional workflows in the interpretation of those data is necessary,” the authors concluded.