NEJM: Putting 'effect' into comparative-effectiveness research
With the federal government putting $1.1 billion into comparative-effectiveness research, two Baylor College of Medicine scientists advocate investing in research that puts science into practice in doctors' offices and clinics across the U.S., according to a perspective in the May 7 issue of the New England Journal of Medicine.

"We need to pay as much attention as to how the evidence is put into practice as to the evidence itself," said Laura A. Petersen, MD, chief of the section of health services research in the department of medicine at Baylor and director of the Veterans Affairs (VA) Health Services Research and Development Center of Excellence in Houston.

In the perspective, Petersen and Aanand D. Naik, MD, an investigator in the VA Center of Excellence, discussed the need for a new emphasis on implementing the results of comparative-effectiveness research.

"How do you get evidence into practice?" Peterson said. "We need to study that with the same intensity as we go about getting the evidence."

Comparative-effectiveness research means scientifically evaluating drugs, medical devices, surgical procedures and other treatments to determine which provides the highest quality of care at a reasonable price, according to the authors. As the Obama Administration seeks to cut the costs of healthcare without affecting quality, the emphasis on comparative-effectiveness research has intensified.

"Policymakers and research funders such as the National Institutes of Health (NIH), often assume that the final steps in the translation of clinical research--the decision to act on new medical evidence and its implementation in routine care--are seamless and automatic, whereas we know that changing the behavior of physicians and patients is difficult," they wrote. "The need for comparative-effectiveness research is clear, many of the assumptions regarding the most important aspect of such research--the ultimate implementation of its findings into healthcare--have little empirical support."

"We do great research, and it ends up on the shelf. It takes an average of 17 years for research to get into practice," Petersen said.

"We expect doctors and healthcare providers to know what is the right thing and then to put it into action. We know that is not happening," she said.

In the perspective, she and Naik described how studies in the early 1990s showed that using a balloon technique to open clogged coronary arteries after a heart attack worked better than treatment with clot-busting drugs. Yet, 10 years after the first publication of these studies, fewer than one-third of hospitals were providing the balloon technique within the time frame needed for high-quality care.

The authors said that experts are now developing ways to put the life-saving treatment into effective practice more widely among U.S. hospitals.