HA: There's no I.O.U. in team: Reimbursements spur quality measurement


Oncology providers and payors can be brought together in a large, statewide consortium to effectively measure quality and improve care, so long as the cost burden of quality improvement initiatives is borne by payors in the group, according to two articles published in the April issue of Health Affairs.

The first article, written by Douglas W. Blayney, MD, of the Stanford University Cancer Institute in Stanford, Calif., and colleagues, broadly described the creation of the Michigan Oncology Quality Consortium and the early results of this collaboration. A related article provided a more focused look at one practice’s experiences in the consortium.

The Michigan Oncology Quality Consortium is a collaboration between medical oncology practices in the state, Blue Cross Blue Shield of Michigan, the American Society of Clinical Oncology and a coordinating center at the University of Michigan in Ann Arbor. To facilitate self-assessment, quality measurement and care improvement, each member of the consortium participates in the Quality Oncology Practice Initiative developed by the American Society of Clinical Oncology.

The Quality Oncology Practice Initiative uses evidence- and guideline-based indicators to measure performance, and is functional for a geographically dispersed cancer care system.

Overall, the authors found practices had an 85 percent rate of adherence to quality care processes for breast and colorectal cancer care. Adherence was lower for end-of-life care processes, where the rate was 73 percent. The lowest adherence rate, at 56 percent, was for symptom and toxicity management care processes.

Interventions were developed to improve adherence to treatment and pain management guidelines, as well as incorporate palliative care into oncology practice.

Among the specific lessons learned by the authors studying the consortium was that simply collecting data on quality process adherence, which can sometimes improve performance on its own thanks to a phenomenon known as the Hawthorne effect, was not enough to measurably affect quality.

“In a large, single-institution physician practice, we had previously observed the operation of the Hawthorne effect on oncologists’ decisions about administering chemotherapy in the last two weeks of life. However, in our multiple practice consortium, relying solely on the Hawthorne effect was not sufficient to produce improvement on most quality measures,” wrote the authors.

Another finding described by Blayney and colleagues was that practice participation in the consortium greatly increased once Blue Cross Blue Shield of Michigan began reimbursing for data entry costs. Since community-based practices often can’t afford to cover unreimbursed costs of quality improvement activities, the authors suggested that “reimbursement specifically for quality improvement activities will increase providers’ participation in those activities.”

An accompanying article described the oncology practice of Tallat Mahmood, MD, and her two colleagues with locations in Lansing and Owosso, Mich. The practice joined the consortium to get help measuring the quality of the care it provided and take advantage of the reimbursements for quality improvement activities offered to member practices.

“Before joining the quality consortium, Mahmood’s practice did not have easy access to the sort of detailed records that would enable its physicians to verify their level of performance day to day or year to year,” wrote the authors. “The absence of consistent record keeping meant they couldn’t achieve verifiable consistency of performance over time.”

Mahmood said she and her colleagues have more confidence in the quality of the care they provide since joining the consortium, even if it’s too early for data to demonstrate any conclusive outcomes.

One drawback is the mountain of extra data entry work required of the practice. Ten to fifteen forms must be entered into the clinic’s database each day, and every six months, data for approximately 80 quality measures must be entered into a database maintained by the American Society of Clinical Oncology, a process that takes about 140 hours.

“Still, the clinic’s doctors and staff have been willing to shoulder the extra work, even at the cost of working nights and some weekends,” wrote the authors. “The return is an assurance that a system is in place to deliver high-quality care to every patient, every day.”