KPIs: Measuring Quality & Standardizing Success

Radiology practices across the U.S. face mounting pressure to improve quality and performance. However, tackling such intangibles presents a bit of a mystery. Key performance indicators (KPIs) provide a means to measure quality and offer a methodical recipe for quantifying and improving performance.

The basic KPI formula is simple:
  1. Pick some problem areas, such as imaging utilization or report turnaround, where performance is below expectation.
  2. Compare performance to the same period in the previous year and compare the last two years cumulatively.
  3. Use a dashboard to display this information to staff, informing and soliciting feedback from all relevant stakeholders to identify sources of sluggish performance.
  4. Develop and publicize feasible performance goals, measuring and refining performance along the way.
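The year-over-year comparison in step two can be sketched in a few lines of code. This is a minimal illustration, not a prescribed implementation; the metric (monthly report turnaround in hours) and all figures are hypothetical.

```python
from statistics import mean

def year_over_year(current: list[float], previous: list[float]) -> dict:
    """Compare a KPI (e.g., monthly report turnaround in hours)
    against the same period in the previous year."""
    cur_avg, prev_avg = mean(current), mean(previous)
    return {
        "current_avg": cur_avg,
        "previous_avg": prev_avg,
        "pct_change": 100.0 * (cur_avg - prev_avg) / prev_avg,
    }

# Hypothetical Q1 turnaround times (hours) for two consecutive years
result = year_over_year([2.4, 2.1, 2.3], [3.0, 2.8, 3.1])
print(f"{result['pct_change']:+.1f}%")  # negative change = faster turnaround
```

The same comparison, run cumulatively over two years and charted on a dashboard, is what makes the trend visible to staff.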

Models abound

The lack of “awareness of the [performance] problem is usually the biggest problem,” explains Paul G. Nagy, PhD, director of quality and informatics research at the University of Maryland. Visuals are key.

KPIs can be presented as one-page scorecards illustrating performance for each KPI relative to goals or as real-time computer programs that alert IT and clinical managers to changes in utilization rates, falling productivity, extended report wait times or phone holding times.

The function of a dashboard, explains David Waldron, CEO of Traction Business Development in Fallston, Md., “is to display the key numbers that are necessary to put the business on the track to success. Dashboards give ... the key bits of information you need to be able to make management decisions in running your business.”

Waldron touts dashboards as “great communication tools, to explain finance to staff who have come up through the trenches of the medical system, often with little formal business training. Instead of their eyes glazing over, you show them the pie charts, the graphs and tables and everyone gets on the same page. They’re able to grasp the information and draw the conclusions necessary to improve performance.”

Simple challenges often provoke KPI initiatives. Customer complaints prompted successful KPI programs at Anne Arundel Diagnostics Imaging in Annapolis, Md., and Riverside Radiology and Interventional Associates in Columbus, Ohio.

“We knew something was wrong. There were parts of the day when radiologists were overwhelmed, and times when the doctors had nothing to do, when we had too many people staffed,” remembers Ron Hosenfeld, CIO of Riverside Radiology. The practice started measuring imaging utilization and staffing throughout the day, while also paying attention to report turnaround time and radiologists’ performance. “Initially, the staffing and study volume curves would be shifted to the right or left of each other, either creating overstaffing or understaffing. We’re now able to keep the curves right on top of one another.”
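Riverside’s curve-matching approach can be sketched by comparing hourly study volume against staffed reading capacity. This is an illustrative sketch only: the per-radiologist throughput, the 60-percent overstaffing cutoff and the hourly figures are all hypothetical.

```python
def staffing_gaps(hourly_volume: list[int], hourly_staff: list[int],
                  studies_per_radiologist: int = 8) -> list[str]:
    """Label each hour by comparing expected reading capacity
    (staff * throughput) with actual study volume."""
    labels = []
    for vol, staff in zip(hourly_volume, hourly_staff):
        capacity = staff * studies_per_radiologist
        if vol > capacity:
            labels.append("understaffed")
        elif vol < 0.6 * capacity:  # hypothetical slack threshold
            labels.append("overstaffed")
        else:
            labels.append("balanced")
    return labels

# Hypothetical 8 a.m. to noon snapshot: studies arriving vs. radiologists on duty
print(staffing_gaps([10, 30, 20, 6], [2, 2, 3, 2]))
```

Plotting the two curves hour by hour, as Riverside does, makes the left-right shift between demand and staffing immediately visible.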

“The very first balanced scorecard we did,” recalls Karen Scott, director of Anne Arundel Diagnostics Imaging, “was on a single sheet of paper, selecting very basic goals that we knew were achievable.” Scott counsels other practices to start simple. “Don’t overdo it, pick some goals that are a little bit of a stretch but that you know you can achieve.”

Anne Arundel maintains this method today, tracking four indicators: finance, people, growth, and service and operations. Scott reports improvements in scheduling, fewer dropped calls, increased productivity, quicker report turnaround time and a more connected workforce.

Riverside’s KPI program has become relatively advanced. Hosenfeld and other IT personnel integrated several software programs into the RIS to monitor individual radiologists’ productivity in real time. Managers track each radiologist’s productivity against his or her own baseline as well as institutional benchmarks. This has enabled IT to identify the causes of variations in productivity, such as a noisy reading room, poor monitor quality or a particular time of day.
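The baseline-versus-benchmark check Riverside runs can be sketched as follows. The function name, the 20-percent tolerance and the studies-per-hour figures are hypothetical, not Riverside’s actual rules.

```python
def productivity_alerts(readings: dict[str, float],
                        baselines: dict[str, float],
                        benchmark: float,
                        tolerance: float = 0.2) -> list[str]:
    """Flag radiologists reading well below their own baseline,
    or below the institutional benchmark (studies per hour)."""
    alerts = []
    for name, rate in readings.items():
        if rate < (1 - tolerance) * baselines[name]:
            alerts.append(f"{name}: below personal baseline")
        elif rate < benchmark:
            alerts.append(f"{name}: below institutional benchmark")
    return alerts

# Hypothetical studies-per-hour readings vs. personal baselines
print(productivity_alerts({"A": 5.0, "B": 9.0},
                          {"A": 8.0, "B": 9.5}, benchmark=7.0))
```

Flagging a dip against the radiologist’s own baseline, rather than the benchmark alone, is what lets managers trace causes like a noisy reading room or a bad monitor rather than blaming the individual.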

Partnership pays

Virginia Mason Medical Center, a 336-bed hospital located in Seattle, has adopted an institution-wide performance monitoring program, the Virginia Mason Production System (VMPS). The medical center adopted this system after two consecutive years in the red, in 1998 and 1999. In peril of calling it quits after nearly 80 years of patient care, the medical center assembled an executive team, with the help of fellow Seattle company and aviation powerhouse Boeing, to study and implement a healthcare equivalent of the Toyota Production System. Virginia Mason implemented VMPS three years later, not only staying afloat but pulling in increasing profit margins even through the latest recession (3.6 percent in 2008 and 5.8 percent in 2009).

Using VMPS, the hospital develops global quality goals, which individual departments and sections actualize through the use of KPIs. Thus, Virginia Mason emphasizes standardization of performance metrics within a flexible system designed by individual departments and sections to meet their own needs and those of the hospital.

Under the hospital’s umbrella goals of patient satisfaction, quality and safety, the radiology department set performance goals for all subspecialties to sign emergent reports within one hour, non-emergent studies within two hours and to ensure all patients have access to imaging studies the same day they are ordered.

Virginia Mason’s radiology KPIs don’t stop there. “We use global, departmental, sectional and individual indicators,” offers Lucy Glenn, MD, chair of radiology. Radiology has its own marks for proper x-ray quality, image labeling and patient health information input. The hospital monitors outcomes from the institutional level down to the performance of individual radiologists and radiographers, maintaining an effective and efficient management system with the help of incentives. (Staff receive annual bonuses of up to $500.)

Virginia Mason emphasizes that performance improvements are about the patient, but they also save the institution $1 million in supplies each year.

Operational KPIs

Administrators and radiologists alike are well-acquainted with the headaches of scheduling. Extended hold times can have as much influence on a patient’s experience and satisfaction as the quality of care. With patient satisfaction and the bottom line in mind, the S. Mark Taper Foundation Imaging Center at Cedars-Sinai Medical Center in Los Angeles implemented two KPIs—to cut average telephone hold time to below 24 seconds and to answer 90 percent of calls within 30 seconds.

The project began by monitoring all agents, hold times, dropped call rates and the time required to schedule different exams. Scheduling supervisor Slava Kroo set up separate phone lines for certain exams, such as mammography and nuclear medicine, and teamed up with registration to collect patient information and payment prior to arrival.

Kroo receives automatic daily reports on all 20 agents and the department as a whole. A stock ticker hangs over a doorway to alert employees to call frequency and wait times. “Making dashboard information available to everyone has encouraged hard work and responsiveness to patient and workload demand,” boasts Kroo, who enjoys being able to reward Cedars-Sinai agents with bonuses for meeting the institution’s KPIs. Kroo reports almost full compliance in scheduling the Taper Center’s 50,000 monthly procedures, keeping average hold time below 24 seconds and answering 90 percent of calls within 30 seconds.
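The two call-center KPIs the Taper Center tracks—average hold time and the share of calls answered within a target—reduce to simple arithmetic over the call log. The hold times below are hypothetical sample data, not Cedars-Sinai figures.

```python
def call_kpis(hold_times: list[float],
              target_seconds: float = 30.0) -> tuple[float, float]:
    """Return (average hold time, service level), where service level
    is the fraction of calls answered within the target."""
    avg_hold = sum(hold_times) / len(hold_times)
    service_level = sum(t <= target_seconds for t in hold_times) / len(hold_times)
    return avg_hold, service_level

# Hypothetical hold times (seconds) for ten calls
avg, sl = call_kpis([5, 12, 40, 8, 22, 31, 3, 18, 10, 25])
print(f"avg hold {avg:.1f}s, {sl:.0%} answered within 30s")
```

Fed by automatic daily reports per agent, these two numbers are what the overhead ticker and the bonus program run on.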

Time for KPIs

Facing escalating financial and competitive pressures, radiology can no longer practice in a data vacuum. Smart business hinges on data-driven planning and goal setting, with KPIs as the cornerstone of the program. The end results speak for themselves: greater patient satisfaction (a competitive advantage), balanced staffing and increased profits.