Towards better hospital care

Right on the heels of the Minnesota Department of Health’s annual report earlier this month on adverse events, the Joint Commission released its own annual report on hospital quality and safety in 2009.

The report tracks improvement over seven years, 2002 through 2008, on 31 quality measures that reflect the current best evidence on effective care for pneumonia, heart attacks, strokes, surgery and other issues that are among the most common reasons for hospital admissions.

Among the findings: There was sustained improvement on 12 of the measures and more limited improvement on 13. In two areas, both in the treatment of pneumonia, hospital performance slipped between 2007 and 2008. Data were submitted by 3,000 hospitals accredited by the Joint Commission, the main accrediting body for hospitals in the United States.

It’s interesting to follow the progress from one year to the next. On one of the measures for heart attack care, for instance, patients were advised to stop smoking 66.6 percent of the time in 2002. By 2008, this had risen to 98.9 percent.

These days there’s a lot of data being generated on hospital performance. The Minnesota Department of Health report, for instance, tracks “never” events, which by definition are both rare and serious. The Joint Commission’s quality measures, on the other hand, focus on common, manageable issues shown to help reduce the likelihood of complications that can lead to morbidity, longer hospital stays and higher costs. Routinely giving antibiotics before surgery, for instance, demonstrably cuts down on the risk that the patient will develop a surgery-related infection.

Want to find out how a specific hospital fared on the Joint Commission’s most recent quality report? You can search the online database at Quality Check, a tool developed by the Joint Commission, and see detailed results.

There has historically been considerable variation in how hospitals have performed on these measures. In some cases, there wasn’t a pathway to help ensure that best practices were consistently reaching every patient. Sometimes the correct steps were taken but they weren’t documented in the medical record. And sometimes there may have been valid clinical reasons why a particular best-practice measure wasn’t followed for an individual patient – a drug allergy, for instance, or other condition or circumstance that may have made a different course of action the wiser choice.

Quality measures aren’t the be-all and end-all of hospital care. Some things, frankly, are hard to measure objectively. Advising patients who’ve been hospitalized with a heart attack to quit smoking might earn the hospital a checkmark in the appropriate box but it doesn’t necessarily mean the patient actually followed through. Some of the measures that track whether the appropriate drugs were prescribed – an ACE inhibitor and beta blocker for heart attack patients, for example – don’t really address questions of dosage or proper follow-up for the patient, which can be equally important in avoiding complications and ensuring a good outcome. There also should be room for some clinical judgment. What’s good for 95 percent of patients may not be best for the other 5 percent.

Overall, though, performance measurement can be an enormously helpful tool for hospitals to track how well they’re doing and to benchmark themselves against other hospitals. After all, you don’t truly know if you’re good, bad or ugly unless you have actual numbers; otherwise you’re just guessing. And you can’t genuinely improve and sustain your forward momentum for the benefit of patients unless you have a reliable way of measuring your progress.

West Central Tribune file photo by Bill Zimmer
