Hospital report cards fall flat at improving patient outcomes
As Medicare officials mull which physician quality metrics to make public starting in 2013, gains from the program's Hospital Compare website are deemed modest.
By Kevin B. O’Reilly — Posted March 19, 2012
Seven years after the federal government started publicly reporting hospitals' performance on quality measures, evidence suggests that this transparency effort has not measurably improved patient outcomes.
The latest discouraging finding is in a study in the March issue of Health Affairs that analyzes death rates among Medicare patients with heart attack, heart failure and pneumonia in the five years before the launch of the government's Hospital Compare website and in the three years afterward. Although individual hospitals' compliance with quality metrics for these conditions was reported publicly, the effort reduced the odds of a heart failure patient dying within 30 days by only 3%. Heart attack and pneumonia patients saw no improvement in death rates, the study concluded.
Researchers adjusted for differences in patient characteristics and used death rates for nonpublicly reported conditions to isolate the impact of Hospital Compare. Mortality rates already were dropping, and public reporting did little to speed progress.
The study comes as officials at the Centers for Medicare & Medicaid Services ponder how to expand the Physician Compare website to include quality data by the January 2013 deadline set by the health system reform law. CMS wants physicians to use the public data to improve their own care, and for patients to gravitate toward higher-quality doctors.
"We should temper our expectations about what programs like public reporting can do to push the needle on quality," said Andrew Ryan, PhD, the study's lead author. "Maybe on the margins, if these report cards are designed just right, they could make a bit of a difference. In and of themselves, they are probably not game-changers."
CMS did not grant an interview request by this article's deadline.
Few significant gains
Earlier research also has not found that public reporting translates into significant quality gains. A Jan. 15, 2008, Annals of Internal Medicine report reviewed 45 studies on public reporting published since 1986. It discovered that although reports spur hospitals to engage in more quality improvement activities, evidence that they improve care safety or effectiveness is sparse.
"People have really had high hopes that public reporting would be part of the solution to improving the quality of care in this country," said Rachel Werner, MD, PhD, an associate professor of medicine at the University of Pennsylvania Perelman School of Medicine who has published widely on the effect of public reporting, pay-for-performance and other initiatives on quality. The newest study "adds to the body of evidence suggesting that public reporting hasn't really done that much for quality improvement."
A variety of reasons could explain public reporting's minimal effect on patient behavior, some quality experts said. The reports largely have focused on measures of health care processes, such as whether hospital heart-attack patients receive aspirin on arrival, that may seem esoteric or unhelpful to patients. Heart attack, heart failure and pneumonia are also acute conditions that do not lend themselves to patients sitting down and carefully comparing how the hospitals in their town perform.
Patients' existing relationships with physicians, and doctors' admission preferences, tend to win out over quality data on a website, said Susan Nedza, MD, adjunct professor of emergency medicine at Northwestern University Feinberg School of Medicine in Chicago.
"If your medical record's at one hospital and your doctor's at that hospital, that is usually enough to keep you from switching," said Dr. Nedza, a former regional chief medical officer in the CMS Chicago office.
The Health Affairs study's use of mortality to judge quality outcomes also may underestimate improvements. "The death rate is a very blunt instrument in terms of measuring quality," said Bruce Bagley, MD, medical director for quality at the American Academy of Family Physicians.
Nancy Foster, vice president for quality and patient safety policy at the American Hospital Assn., agreed.
"It is a mistake to think that one can draw a straight line between any of the process measures and improved performance on a metric like mortality within 30 days of admission to a hospital. Many of the critical steps in care that have been measured and reported on Hospital Compare are not likely to influence patients' mortality within the brief time period of hospitalization," she said. Persuading a heart attack patient to quit smoking, for instance, probably would affect the patient's long-term -- not short-term -- survival.
Looking at the aggregate quality effect of public reporting also may miss subtler effects, said David Dranove, PhD, professor of health industry management at Northwestern University Kellogg School of Management in Evanston, Ill. Lower-performing hospitals have responded to public reporting by improving their performances to the industry average, he said. And although overall quality performance may not change, some patients do get higher quality care by switching to better-scoring facilities.
Seeking effective transparency
The examination of Hospital Compare was one of several reports in the March issue of Health Affairs to analyze the effect of system reforms on quality. A study of voluntary public reporting in Wisconsin found that participating physician groups were more likely to implement diabetes care quality interventions, but the study did not report improvements in patient outcomes.
Another study found that -- contrary to the hope that health information technology would slash unnecessary testing -- physicians with access to patients' prior imaging results were 40% to 70% likelier to order those tests again. But that study used data from 2008, before CMS linked incentive pay to health IT use. The study also did not examine whether the imaging tests were clinically appropriate, critics noted.
Finally, a study concluded that patients presented with physician cost and quality data may simply choose doctors who order more services.
"I was concerned that people would use the cost information as a proxy for quality," said Judith H. Hibbard, MPH, DrPH, the study's lead author. Patients did so, perceiving the higher dollar amounts that were placed next to a physician's name as a signal of quality.
That study highlights the importance of properly designing quality reports so that patients can understand and act on them. The federal government and others need to do a better job at that, said Lisa McGiffert, director of Consumers Union's Safe Patient Project. "Fundamentally, the quality data should be out there. People have a right to know. But just putting a bunch of numbers out doesn't satisfy that right. ... It is confusing for consumers," she said.
Misleading public reports are one of the American Medical Association's principal concerns about the data that may be published on Physician Compare. The AMA said the metrics chosen should be based on accurate information, developed by organizations such as the AMA-convened Physician Consortium for Performance Improvement, and never based solely on care utilization or cost.
The AMA wants physicians to be given enough time to review information and correct any inaccuracies before data are posted publicly. CMS plans to propose 2013 metrics for Physician Compare this summer.