Hospital comparisons: behind the numbers

If you have a heart attack or pneumonia, does it matter which hospital you go to? Yes, it does, according to comparative data collected by the Centers for Medicare and Medicaid Services, the federal agency that runs the Medicare and Medicaid programs.

Not all hospitals, it seems, are created equal, at least when you look at statistics that measure important benchmarks such as whether the appropriate drugs were prescribed, how many patients were readmitted to the hospital within 30 days, and how many patients died. The numbers are publicly reported, hospital by hospital, at the U.S. Department of Health and Human Services’ Hospital Compare website (and summarized here for hospitals in Minnesota; as of this writing, the Hospital Compare site itself appears to be experiencing technical difficulties and is unavailable).

This database, and what the numbers mean for patient care, was the subject of an in-depth story this past weekend in the Star Tribune of Minneapolis. From the story:

The two small-town hospitals could hardly be more alike. Just 20 miles apart in southern Minnesota, they’re both run by the Mayo Health System and even share some of the same doctors.

Yet in Albert Lea, patients hospitalized with heart failure are twice as likely to die as those in neighboring Austin, government data show.

That kind of gap may seem improbable, especially in a state known for first-rate medical care. But new ratings published by the federal government have found startling disparities in hospital performance all across Minnesota.

Variation has always been an issue in health care. No two organizations, and even no two individual providers, give care in exactly the same way. In fact, you can make a good case for some degree of variation, because no two patients, or their needs, are exactly alike. Consistently falling well outside the norm, however, is a clear signal that something is amiss.

For this reason, it’s helpful to measure these things. Providers might have an idea where they stand on quality measures, but unless they actually benchmark themselves and compare themselves to their peers, it’s all guesswork. Publicly reporting the results also adds an element of accountability, or at least transparency, with the public.

As with any set of statistics, of course, some caution is needed with the interpretation. The data tracked at Hospital Compare, known as core measures, are very specific: If a patient was admitted with a heart attack, did he or she receive aspirin upon arrival? Were aspirin and a beta blocker prescribed when the patient was discharged? Did the patient with pneumonia receive an influenza vaccination? What the numbers don’t tell you is whether the patient filled the prescription after leaving the hospital and took the drugs as prescribed. They don’t tell you if the pneumonia patient was offered a flu shot and said no. For that matter, they only capture what’s actually documented in the patient’s chart. If smoking cessation was indeed addressed with the pneumonia patient but someone didn’t check the right box, for the purposes of Hospital Compare it doesn’t count.

Demographic characteristics of the community can make a difference in the numbers. It’s rather revealing to see that Hennepin County Medical Center in Minneapolis didn’t fare as well on the mortality rate among heart attack patients over age 65. Does the fact that HCMC is the state’s safety net for the poor and homeless have something to do with this? It should also be noted that percentages can be skewed for smaller hospitals that may see only a handful of patient admissions for acute heart attack or congestive heart failure. In these cases, omitting or not documenting just one core measure can drag down the entire score, sometimes significantly.
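To make the small-numbers point concrete, here’s a minimal sketch, using made-up admission counts rather than actual Hospital Compare figures, of how a single undocumented chart moves a core measure score at a small hospital versus a large one:

```python
# Illustrative sketch only: the admission counts below are hypothetical,
# not actual Hospital Compare data.

def core_measure_score(documented: int, admissions: int) -> float:
    """Percent of eligible admissions where the core measure was documented."""
    return 100.0 * documented / admissions

# A large hospital with 400 heart failure admissions misses one chart:
print(core_measure_score(399, 400))  # 99.75 -- barely registers

# A small hospital with 5 admissions misses the same single chart:
print(core_measure_score(4, 5))      # 80.0 -- a 20-point drop
```

The same single omission that amounts to statistical noise at a high-volume hospital reads as a 20-point failure at a low-volume one, which is why the percentages for small hospitals deserve extra caution.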

It’s important to remember, too, that the hospital stay is only one point on the health care continuum. If a patient ends up being readmitted within 30 days, poor hospital care might not be the only factor to blame. A crowded schedule at the primary care doctor’s office might mean patients wait several days before seeing their doctor for a follow-up visit. When follow-up visits aren’t timely, issues with new prescription drugs or emerging complications can’t be addressed soon enough, and patients end up at greater risk of a hospital readmission.

That said, it’s enormously helpful to hospitals to know where they stand on these core measures. In fact, the Hospital Compare data are probably far more valuable to hospitals than to the public. From the Star Tribune article:

The ratings don’t quite explain why some hospitals perform better than others, and there’s little evidence that consumers use them to shop around for care.

But hospitals are paying attention.

“If I’m running a hospital, I want it to be as good as my neighbors,” said Dr. Gordon Mosser, an expert in quality measurement at the University of Minnesota School of Public Health. “Once the numbers start getting reported, they start caring a lot.”

I’d like to think that if a hospital finds itself out of the norm month after month, it would trigger some soul-searching and a lot of unflinching questions about why this is so – and an effort to change. Likewise, if a hospital consistently does well on the core measures, it’s a powerful reinforcement that good processes are in place and that the staff is delivering high-quality care comparable to the best of their peers.

Over the years, I’ve met very few health care professionals who were OK with being “pretty good” or with working for an organization that’s “pretty good.” The vast majority of them want to be way better than this. If benchmarking and comparisons help spur them toward excellence, it’s a benefit to hospitals and it’s a benefit to patients.
