Out with the old

Purging the bad stuff of 2012 so the new year can start with a clean slate has become something of a tradition in New York City’s Times Square, where the Times Square Alliance hosted its sixth annual Good Riddance Day last week.

A Dumpster and giant shredder were hauled in so people could trash all their worst memories – tax records, insurance forms and painful reminders of Hurricane Sandy, unemployment and cheating ex-boyfriends.

Catharsis is supposed to be a good thing, so here’s some health-related baggage we’d like to see consigned to the shredder before the year is over:

– The politicizing of health.

– Public messages that hype or oversimplify health issues.

– Health scare tactics.

– The demonizing of obesity.

– The urge to pathologize a condition in order to sell a product or service (“You aren’t coming down with a cold; you have a pre-cold!”).

– Cluelessness by health organizations about the patient experience.

– Automated phone systems that strand patients in voice-mail no-man’s-land.

– Pill bottle caps that are complicated and difficult to open.

– The scale at the doctor’s office.

Into the Dumpster with all of them! Here’s to a fresh start in 2013.

How sweet it isn’t

First it was fats, then carbohydrates. Now sugar has joined the ranks of nutritional villainy.

With Christmas approaching on a tidal wave of candy canes and gingerbread, one can’t help wondering: Is it OK to indulge in a little sweetness, or is sugar entirely bad?

There’s no denying a certain amount of hysteria when it comes to sugar. Critics claim sugar causes everything from hyperactivity to premature aging. A common – and inaccurate – belief about cancer is that cancer cells feed on sugar.

Some of this is hyperbole, but there’s also a considerable body of science examining the effects, both good and bad, of sugar consumption. Sugar has been linked, for instance, to increased risk of weight gain, diabetes and heart disease. Earlier this year, some U.S. health experts went so far as to declare that sugar is as addictive and dangerous as alcohol or tobacco and should be regulated accordingly.

Unfortunately it’s not always clear whether sugar itself is the culprit or whether something more complex is going on.

At least part of the reason why higher sugar consumption is linked to weight gain may simply be the extra calories. One of the issues with sugar-sweetened sodas and other beverages isn’t just that they contain lots of sugar, it’s that they often end up replacing water or milk in someone’s diet. Processed foods high in sugar also can be higher in fat and sodium, which are associated with negative health effects of their own.

A certain amount of sugar is necessary in order for the human body to function, but moderation seems to be called for here. Consuming large quantities of sugar each day also tends to be a marker for an overall diet that may not be optimal for health.

This brings us to the real vexation: the proverbial sweet tooth. Why do so many people love sugar and why can it be so hard to consume less of it?

I admit to not totally understanding the whole sweet tooth thing. If you were to invite me to your holiday buffet, I would go directly to the spinach dip and the shrimp cocktail. Cookies and candy, not so much. But this wouldn’t necessarily be the case for other guests.

There’s debate about whether so-called sugar addiction is real or imagined. Some studies have found clinical similarities between food cravings and drug dependence. A study published this year in the Journal of Psychoactive Drugs found, for instance, that when people binge on sugar-dense foods, it increases the amount of extracellular dopamine in their brain, which has the potential to lead to addiction.

The authors wrote, “There appear to be several biological and psychological similarities between food addiction and drug dependence, including craving and loss of control.” They also note that for some people, consuming these foods is comforting and therefore might be regarded as a form of self-medication.

So far, however, the sugar addiction theory has mostly been tested on rats and mice, with implications for human behavior that are unclear at best. A 2010 review in the Clinical Nutrition journal examined the evidence and concluded there’s nothing yet in the literature suggesting that humans can become addicted to sugar or that sugar addiction plays a role in obesity or eating disorders.

The bottom line is that when it comes to “sugar addiction,” the jury still seems to be out.

In the meantime, here’s some guidance. The U.S. Department of Agriculture’s 2010 dietary guidelines recommend limiting consumption of added sugars and solid fats to somewhere between 5 and 15 percent of total daily calories. The American Heart Association suggests no more than 100 calories a day of added sugar for most women and no more than 150 calories a day for most men. That’s about 6 teaspoons for women and 9 for men.

Here’s something else to keep in mind. For most Americans, the main source of the sugar they consume isn’t that spoonful they dump into their coffee, or a homemade dessert or even a Christmas cookie. Most of our dietary sugar comes in the form of added sugar – sugars and sugar-based syrups that are added to food during processing. Although the tendency is to single out highly sugared products, such as sodas, as the problem, added sugar can show up in a variety of foods that may not be readily recognized – chicken nuggets, for instance, or ketchup or children’s breakfast cereals, all of which are often surprisingly high in sugar.

So go ahead and have that reindeer-shaped holiday cookie if you want. If you’re worried about the sugar, just take one.

The looming shortage of doctors

Not too long ago, the Association of American Medical Colleges unveiled a new print ad depicting a patient sitting alone and distressed in an exam room. The stark message: “By the time you notice America’s doctor shortage, it will be too late.”

A new round of ads released this month warns, “Careful what you cut.”

What many rural communities have known for years is increasingly catching up with everyone else: The supply of doctors won’t be enough to meet future demand.

The AAMC’s workforce estimates aren’t encouraging. By 2020, the U.S. is projected to have a shortfall of 90,000 doctors, according to data collected and analyzed by the AAMC Center for Workforce Studies.

Part of the shortage is on the demand side. As the baby-boom generation ages, there will be a dramatic increase in the number of Americans older than 65 – precisely the population that tends to use health care services the most. Easier access to health insurance, the result of the federal health care reform law, also is expected to bring millions of formerly uninsured patients into the system, many of them with pent-up health needs that will need addressing.

But the absolute number of doctors is also anticipated to decline in future years. Although at least half of the projected shortage is among primary care doctors, the other half will be among specialties not customarily thought of as being in short supply – surgeons, oncologists, endocrinologists and more.

In practical terms, what it means for patients is the likelihood of longer waits in the future to see a doctor, more difficulty obtaining timely appointments, and possibly delays in care that could have long-term health consequences.

This is obviously a vastly oversimplified picture of what’s happening in the U.S. physician workforce right now. It doesn’t even touch on the many other issues churning alongside the basic math of supply vs. demand – the number of physicians who will be retiring in the next decade, for instance, or the tendency for physicians to gravitate toward non-rural practice, or the soaring cost of a medical education, or the staggering educational debt that students accumulate and its impact on their choice of specialty.

There’s another factor, though, that much of the public may be less aware of: Federal funding for residency training, which all physicians go through after completing four years of medical school, has not increased to meet current demands. The result is a bottleneck that has reduced U.S. capacity to get physicians fully trained and into the workforce.

Physicians – and indeed all the health professions – have unique training needs. Although much of their initial learning takes place in the classroom, at some point they must be unleashed on actual, live patients to hone their skills. Without this hands-on experience, there’s no way to become familiar with health and illness and the variety of ways they manifest themselves across the spectrum of patients. There’s no other way to become proficient at diagnosing, treating, prescribing and performing procedures.

It takes time, money and resources to provide the necessary programs and supervision at teaching hospitals where residency training takes place – which is why outside funding is so critical.

The Association of American Medical Colleges has ramped up its lobbying effort with Congress this year to eliminate a freeze on Medicare funding for residency training. The freeze has been in place since 1997 – a full 15 years. Even a relatively modest 15 percent increase would be enough for teaching hospitals to prepare 4,000 additional doctors per year, the AAMC argues. The AAMC estimates that 10,000 more doctors need to be trained annually to completely address the pending shortage.

Certainly there are other sources of residency funding – state governments, private businesses and the teaching hospitals themselves, to name a few. The number of residents in training in the U.S. in fact has grown despite the Medicare cap. State governments and cash-strapped hospitals may not be able to sustain this indefinitely, however, nor does a fragmented approach necessarily ensure that the right types and amounts of primary care doctors and specialists are being trained.

Redesigning the health care delivery system to make it more team-centered could help blunt some of the growing shortfall in the physician workforce, yet this is unlikely to be the total answer. Patient care will still demand the skills and training of a doctor.

Considering the success that physician practices here in rural central Minnesota have had in hiring new doctors the last couple of years, it may seem there’s no real urgency to address a looming shortage of doctors.

But what’s critical to keep in mind is that the pipeline from medical student to full-fledged physician is long – seven years at a minimum, and usually longer for subspecialists. Is it OK to delay action until the problem becomes more painfully evident? The AAMC says no: “The United States cannot afford to wait until the physician shortage takes full effect because by then, it will be too late.”

Too stoic for antidepressants

Should people who are depressed take antidepressant medication, or should they just tough it out?

There’s often a stigma surrounding the use of antidepressants, and it may be preventing people from getting the treatment they need, college student Leah Lancaster wrote this week in an insightful opinion piece for the Minnesota Daily.

Lancaster writes that she has been taking antidepressants since she was 15 years old – and that without them, she most likely would not have gone to college. “Yet, when the topic comes up, I often find myself defending my decision against accusations that I’m ‘numbing myself’ or ‘taking the easy way out,'” she writes. “Supposedly, if I did yoga, ate healthier and took a more ‘natural’ approach, I wouldn’t need to contaminate my mind and body with toxic pills.”

Some of the stigma surrounding depression itself seems to have eased in the last couple of decades. But when it comes to antidepressants, it can still be hard for the public to accept that for many people, medication may be necessary to help them feel better.

It’s hard to measure how widespread this attitude might be. It clearly exists, however, and one of the consequences is untreated depression. A study that appeared last year in the Annals of Family Medicine found that patients often don’t tell their primary care doctor that they’re experiencing depression. The No. 1 reason for this lack of disclosure? They feared being prescribed an antidepressant.

Even in the medical setting, patients often are reluctant to report that they take prescription medication for depression or anxiety, writes Mag Inzire, a physician assistant at a community hospital in New York.

The patients she encounters rarely worry about disclosing a history of diabetes or high blood pressure, she wrote. “Yet when it comes to depression or anxiety, there is some uncertainty in their response. And it always seems to follow by some long, drawn-out explanation as if to justify the diagnosis.”

Depression in fact is relatively common. In any given year, about 6.5 percent of the American population will experience depression. Across a lifetime, about 16.2 percent of the population will have depression at some point. Stigma or not, antidepressants are one of the most frequently prescribed drug categories in the United States.

How antidepressants work in the brain, and whether they’re truly effective, is a matter for some debate. At one time it was thought that low levels of serotonin, a mood-enhancing chemical, were a trigger for depression, and that drugs such as Prozac, which raise the level of serotonin in the brain, would correct this. This theory has been called into question, though, and if continuing neuroscience study is any indication, the role of antidepressants is considerably more complex than this.

Why, for instance, does medication seem to be more effective for severe depression but less so for mild or moderate depression? Why do some antidepressant medications cause a worsening of depression in some people?

Studies have found that people with mild depression often do well with talk therapy alone. Other studies have found that a combination of medication and talk therapy is often most effective for mild to moderate depression. What does this mean for the role of talk therapy in treating some forms of depression?

Of the millions of antidepressant pills dispensed in the U.S. each year, some likely have been overprescribed to those who don’t really need them. “The reality is that many psychiatrists do give out pills too freely, and many patients start taking medications without properly researching them beforehand,” Lancaster writes.

But in her own case, antidepressant medication has made the difference between being able to function vs. withdrawing from life, she wrote.

Medication hasn’t been a cure for her. “No pills can do that,” she wrote. “What they can do is give you some energy and focus so you can make it through the day without feeling lethargic, irritable or just downright horrible.”

And she notes a double standard, at least in college-campus culture, of peers who view binge drinking, smoking, unprotected sex and “study drugs” as socially acceptable but believe antidepressants are “dangerous and mind numbing.”

“Like any medicine, antidepressants aren’t perfect,” she wrote. “But to make the sweeping generalization that all of them are bad is dangerous and prevents many from getting the help they need.”

Does online patient access = better care?

When patients have online access to their medical record and the ability to email their doctor, does it lead to better care?

In theory, patient portals are supposed to improve communication between patients and clinicians and encourage patients to become more engaged in their care, thereby producing better outcomes. There has also been a belief that if patients can view test results online and have non-urgent concerns resolved via email, it would help cut down on in-person use of health care services and allow the system to function more efficiently.

But a new study, carried out by Kaiser Permanente and published last month in the Journal of the American Medical Association, has found that the case for efficiency might be wishful thinking.

Previous studies that have examined patient use of health information technology have mostly been small. This study was a large one, involving about 44,000 Kaiser Colorado members who had online access to their medical record and 44,000 who did not. Both groups were followed before and after the introduction of an online patient portal.

The results were surprising. Contrary to what the researchers expected, patient use of the portal was associated with more, rather than fewer, office visits and telephone calls. The researchers found an 8 percent increase in the volume of phone calls from these patients and a 16 percent increase in office visits.

Patients who had access to the online portal also made more visits to the clinic after hours, went to the emergency room more often and were hospitalized more often than those who weren’t signed up for the portal.

Is there a connection between online access to medical information and care-seeking behavior by patients? The study wasn’t designed to explore this, although Dr. Ted Palen, the lead author, speculated in an accompanying audio interview that patients who anticipated needing more care may have been more likely to sign up for the portal in the first place.

It’s an intriguing study because there’s still much that isn’t clear about how health information technology is shaping patient behavior and the impact this has on the delivery of health care. Which patients are more likely to use online access to their doctor and their medical record? Do patient portals foster more engagement? Are outcomes better for these patients? When a medical practice decides to offer an online patient portal, does it promote more efficient communication or does it create an extra burden?

This last point is important. An 8 percent increase in phone calls or a 16 percent increase in office visits might not sound like much, but when it’s applied across a system with thousands of patients, it can add up to a significant impact, Dr. Palen points out.

Health systems considering the use of patient portals need to ask themselves whether they have the capacity to absorb a potential increase in utilization, he said. “You’d better plan for that.”

The findings from the study could further dampen the enthusiasm for online patient portals, which the health care system has been slow to adopt anyway. What we don’t know, however, is whether the additional office visits and phone calls were actually beneficial in some way to patient health, or whether there was a fundamental difference between the patients who signed up for the portal and those who didn’t.

The increased utilization may in fact have been “a good thing” if it led to better outcomes and health status in the long run, Dr. Palen said. But it seems far more study and analysis are needed to truly sort out the impact of patient-centered information technology and how to use it wisely, appropriately and effectively.

Grief for the holidays

It’s hard to look at the calendar and not be reminded that Christmas Eve will mark exactly four months since my dad’s funeral. There’s going to be an enormous gap in the family holiday celebration this year, and in fact every year from now on.

But for what it’s worth, we are far from alone in having grief as an uninvited guest for the holidays.

Although the cultural expectation is that this is supposed to be a joyful time of the year, the reality is otherwise for anyone dealing with death, illness, financial difficulties, divorce, homelessness, or other forms of loss.

We shouldn’t need to be reminded of this, but somehow we often do anyway. And it seems many of us need outside advice on how to cope – or, for those who aren’t anticipating that their own holidays might be difficult, advice on how to be sensitive toward family and friends who are.

My email inbox has been filling up since October with suggestions on everything from getting through the holidays while undergoing cancer treatment to coping after a natural disaster. A half-hour on the Internet turned up even more advice and insight, much of it from experts on grief.

If there’s one message to be gleaned from all this information, it would perhaps be this: Expect your emotions to be near the surface and expect that it will be hard at times, but concentrate on how you can make the holidays both manageable and meaningful in spite of what you’re dealing with.

Caroline Flohr, who lives in suburban Seattle and recently published “Heaven’s Child,” a memoir about the sudden death of her 16-year-old daughter, Sarah, has this to say: “Through the web of pain, I have been amazed by the power of family, love and faith in healing.”

Have faith in your own inner strength and be appreciative of what you have, she writes.

From a grief counselor: Try to avoid comparing your situation with that of other people who are together and enjoying the holidays; no family gathering is perfect or stress-free.

Alan Wolfelt, the founder of the Center for Loss and Life Transition in Fort Collins, Colo., and a noted author and counselor, suggests that rather than allowing well-meaning friends and family to prescribe how they think you should spend the holiday, focus instead on what would be meaningful to you.

What about the thousands of people for whom health challenges will be an unavoidable part of the holidays? Deborah Cornwall, a leadership volunteer for the American Cancer Society and author of a new book, “Things I Wish I’d Known: Cancer Caregivers Speak Out,” sums it up this way: “Keep it festive. Keep it simple. Keep it social. Keep it positive.”

Having cancer or being a caregiver for someone with cancer (or any other major or chronic disease, for that matter) is often overwhelming, so look for normalcy, she advises. This might mean focusing on a few traditional activities, such as baking and decorating cookies, that are most important to you and skipping the rest. Make togetherness the priority – and find time to laugh, Cornwall suggests.

Those who haven’t yet experienced grief or illness or hardship during the holidays may want to be helpful but don’t know what to say or do.

Again, the experts come to the rescue with some important tips: Don’t judge. Don’t give advice that hasn’t been asked for. Be present and listen. Rather than waiting to be asked or making vague offers of help, take the initiative and offer to help in ways that are specific and practical, such as bringing over dinner or shoveling snow off the sidewalk.

In the days and weeks after Dad died, it was often the little things that mattered most – the cards, the phone calls, the neighbors who brought food, the people who took the time to share their memories of him.

Studies on coping with grief and adversity mostly point to the same conclusion: Support from other people matters, and an essential part of the recovery process is the construction of meaning out of loss. Even though the holidays are often a serious test of people’s emotional fortitude, at the same time they can be an opportunity for the sick, the struggling and the bereaved to become more resilient.

Clinician, wash thy hands: When the patient becomes the enforcer

As an intensive care physician at a hospital in Saskatoon, Dr. Susan Shaw is both comfortable and confident in the health care setting. But when she recently had to bring her daughter to the emergency room with a broken arm, “I couldn’t do it,” she blogged recently. “I couldn’t ask the nurses and doctor who looked after my daughter to wash their hands.”

If she was uncomfortable speaking up, what about the average patient and family?

Dr. Shaw’s experience illustrates an important and often overlooked issue in patient safety: the gap between what patients and families are told to do – “Remind providers to wash their hands” – and their willingness and/or ability to actually do so.

Hand washing is one of the key ways clinicians can avoid spreading germs to their patients and reduce the risk of infections acquired in the hospital setting. Yet health care workers often are inconsistent at washing their hands. In various studies that have attempted to measure hand washing compliance, the rate at some hospitals has been estimated at an abysmal 25 to 30 percent. Some studies have measured especially low rates among doctors and medical students.

No wonder, then, that patients and families have been enlisted in the campaign to help clinicians do better. But is the public any more successful in the role of enforcer than hospitals, medical clinics and nursing homes have been? It seems the answer is no.

A study that appears this month in the Infection Control and Hospital Epidemiology journal queried 200 patients about their awareness of the importance of hand washing and whether they would feel comfortable reminding nurses and doctors to wash their hands. The patients, all of whom were considered at higher risk for infections such as methicillin-resistant staph, were highly aware of the need for hand washing – but only 14 percent said they had ever spoken up and asked a health care worker to wash their hands. Moreover, only about half said they wouldn’t feel awkward asking a doctor to wash his or her hands.

Why such a gap between theory and practice? Based on her own experience, Dr. Shaw suggests some reasons:

I was worried about making the doctor and nurses feel uncomfortable. I knew my daughter’s care wouldn’t be compromised if I upset the doctor and nurses. I just didn’t want any awkward feelings.

Readers weighed in with similar stories of being too uncomfortable to ask – or of asking and being met with a grudging response. One person wrote, “When I did ask a nurse if she had washed her hands, I was greeted with a look that said to me I had been labeled a ‘difficult family member.'”

Indeed, more than a few health care professionals believe it’s not the patient’s or family’s place to remind them to wash their hands, as a study published a few months ago in the Archives of Internal Medicine concluded. Nearly one-third of the doctors and nurses who were surveyed said it was inappropriate for patients and families to have a role in this, and almost two out of five said they wouldn’t wear a button or sticker urging patients to ask them if they washed their hands. The reasons they gave: 43 percent said a reminder from a patient or family member would make them feel guilty, 27 percent said it would be humiliating and about 25 percent said having to stop and wash their hands in response to prompting by the patient would take too much time.

It all raises the question: Is it effective or even realistic to expect patients to help enforce hand washing compliance?

Patient and family participation is difficult to accomplish when the culture of a health care organization doesn’t openly encourage and support it, Dr. Shaw wrote. “I think it comes down to creating a new norm where healthcare workers clearly give permission to patients to expect and demand hand hygiene be part of care each and every time. This means focusing on the culture, behaviours and attitudes of us as healthcare providers.”

There’s an even larger issue here, though, and it’s this: Should hand hygiene be a shared responsibility between patients and clinicians, or does it rest first and foremost with the health care professionals?

Dr. Shaw reflects: “What I learned from my experience is that it is the responsibility of the care provider and the healthcare system to do the right thing every time for every patient without expecting patients to be the inspectors of our work.”

Perhaps we shouldn’t be asking how to make it easier for patients and families to speak up and remind clinicians to wash their hands. Perhaps we should be asking instead how to get clinicians to wash their hands every time without needing to be reminded by patients.

A blogiversary, and lessons learned

It dawned on me this weekend that as of Dec. 1, this blog has been around for four years.

That’s hardly a milestone occasion like a five-year or 10-year anniversary would be. But in the here-today-gone-tomorrow world of blogging, a blog that’s been around for four years is… well, relatively ancient.

I did a quickie Google search and came up with this tidbit about blog longevity: Worldwide, 60 to 80 percent of blogs are either abandoned or updated infrequently within a month of being created. According to the folks at Squawkbox, “This means that the average lifespan of a blog is equal to that of the common fruitfly.”

So a fourth birthday is a pretty good deal, I guess.

Four musings on the occasion, one for each year, in no particular order of importance:

– Health is universal. It affects people’s lives in many ways every single day, from how they communicate with their doctor and whether they take their pills to what they eat, how well they sleep and how they cope with stress. Health is often scholarly and academic but it also intersects with popular culture. There’s never a shortage of topics to blog about, and it’s always interesting to see how the issues resonate with readers.

– Blogging is demanding. Fresh content is what drives traffic to a blog and keeps readers coming back, but it’s a constant challenge to feed the beast. Most bloggers will tell you that sustaining the mental and creative energy for it, along with the time commitment, isn’t easy. I used to blog an average of three times a week; now I’m down to twice a week, which seems somewhat more manageable.

– Audience reach is surprising. In the beginning, most of the readers here were local. Now they come from all over, and local readers are a minority. (Hello out there, everyone!) Belonging to the Forum Communications Co.’s Area Voices online community has been a tremendous help with visibility. Most bloggers like some reassurance they’re not just talking to themselves. Area Voices has increased this blog’s exposure in ways I could never have accomplished alone.

– Blogging is rewarding. When you have a passion for a topic, such as health, blogging about it isn’t work; it’s satisfying and often fun. It’s especially rewarding when readers leave comments sharing their own thoughts and insight and experiences. The conversation becomes better, both for readers and for me.

Now let’s blow out the candles and have some cake!