Flipping the clinic

Advocates of patient-centered care talk of “flipping the clinic” to create an environment in which everything, from the sign-in process to how patient-provider conversations are conducted, is designed with the patient experience foremost in mind.

So here’s a really radical thought. What would happen if we took this one step further and flipped the flip? What if patients also became more mindful of the provider experience and, by extension, more supportive and understanding of their doctor, their nurse and others involved in their care? What might we gain as a result?

It’s a novel idea raised by Natasha Gajewski, a patient advocate and innovator who recently blogged about attending a Flip the Clinic symposium hosted by the Robert Wood Johnson Foundation and hearing some of the success stories about patients who were empowered to become full participants in their care. Seeing how rewarding it was for clinicians to help make this happen, Gajewski reflects:

… [J]oy and satisfaction are apparently in short supply amongst care providers, particularly those on the front lines. So I find it curious we focus so much attention on salvaging the wellbeing of the patient, when studies and the emerging crisis in primary care suggest that more attention needs to be given to improving the wellbeing of clinicians.

Then she asks the million-dollar question: “Could patients cure clinician burnout and other problems in our healthcare system?”

In one of those moments of synchronicity, I came across Gajewski’s online posting at the same time I was reading Dr. Danielle Ofri’s new book, “What Doctors Feel,” about the undeniable – and sometimes overwhelming – role of emotion in the practice of medicine. Then, to clinch it, New York Times contributor Dr. Pauline Chen wrote this week about the growing problem of clinician burnout and how to lessen it.

It would be hard to find a health care provider in the U.S. these days who isn’t under tremendous stress at least part of the time. Although stress is inherent in the health care professions, it’s being ratcheted up and up and up by constant change, increasing demands on people’s time and energy, and resources that are becoming ever more strained.

While most Americans are probably aware of this, one has to wonder how well the average person understands the connection between frazzled, exhausted, unhappy clinicians and unsatisfactory patient experiences.

Dr. Chen lays it out for us:

Research over the last few years has revealed that unrelenting job pressures cause two-thirds of fully trained doctors to experience the emotional, mental and physical exhaustion characteristic of burnout. Health care workers who are burned out are at higher risk for substance abuse, lying, cheating and even suicide. They tend to make more errors and lose their sense of empathy for others. And they are more prone to leave clinical practice.

Do patients care? More than a few people might react the same way as one of Dr. Chen’s online readers who called it professional narcissism: “Enough about how physicians and other medical professionals’ lives are so difficult. We all have jobs that are hard to do.”

But I would hope this is a minority view. At its core, health care is about relationships between human beings. It’s hard for the parties to truly engage with each other when one of them is physically and mentally overloaded, frustrated and on the verge of burnout.

This isn’t to say that provider satisfaction should be the sole responsibility of patients. Patients often have enough to handle without also worrying about managing the morale of their health care team.

But patients do have a stake in this, whether they realize it or not. So, to go back to Gajewski’s question: Is there something we can do about it?

The hard life of kids

Alec Fischer’s documentary about bullying in Minnesota’s schools is only 45 minutes long but it clearly packs an emotional punch, as a local audience saw for itself last week.

During a showing of the film at Ridgewater College, the room grew hushed while photos flashed across the screen of students who killed themselves after prolonged bullying by their peers.

One of the filmmaker’s messages: Kids are singled out by bullies for many different reasons, and it won’t stop unless more people, both kids and adults, speak up about it.

From the vantage point of adults, childhood and adolescence can seem like a golden time. But there’s mounting evidence that this no longer is the case for all too many kids (if indeed it ever was).

Take the Minnesota Student Survey. Conducted every three years for students in grades 6, 9 and 12, it tracks them as they progress through middle school and high school. It’s seen as a good barometer for risky behaviors such as alcohol and tobacco use and early sexual activity.

But look at what students are reporting about their state of mind. In 2010, the most recent year in which the survey was administered, 8 percent of sixth-grade boys and 9 percent of sixth-grade girls reported feeling almost more pressure than they could handle at some time within the past 30 days. Among 12th-graders, 11 percent of boys and 19 percent of girls said they felt this way. Many kids also reported distressing symptoms such as frequent headaches and stomachaches, sleep difficulties and feeling unhappy or sad.

To be sure, these weren’t the majority. Most kids in fact seemed to be doing OK, and the vast majority said they liked school and that they had parents and other adult relatives who cared greatly about them.

It’s disturbing, however, that so many young people are experiencing high levels of stress and anxiety. Bullying, which seems to have become much more pervasive than it was a generation ago, is only one of the problems that students encounter.

A recent Associated Press story, which explored what some high schools are doing to reduce kids’ anxiety, noted that adolescents have a lot on their shoulders these days. School officials pointed to hectic schedules, academic overload and pressure to achieve, and kids spoke of days packed with nonstop activity. Here’s a typical day for Abbie Kaplan, a student at Boston Latin School:

On a scale of 1 to 10, she places her stress level at a pretty steady 9. She regularly has four hours of homework a night, some done before swim practice. She eats dinner around 9:30 p.m., then finishes the rest of her homework and generally goes to bed at 11:30. Then she’s up at 6 a.m. so she can be at school by 7:45.

She calls her hectic schedule “the new normal.”

“You keep telling yourself that it will prepare you for the future,” Kaplan says. “It’s just sort of how it is.”

She, too, has had anxiety attacks related to her workload, she says.

And a rising tide of stress among the younger generation was highlighted in yet another recent survey, this one by the American Psychological Association, which found that stress levels have increased for Americans of all ages but are being felt most keenly by young adults. The study also found that younger adults seem to have more difficulty managing their stress and that their stress has increased in the past year.

Some of this may simply be how the human psyche matures and ages. Other studies have found that the years past middle age, when people tend to have accumulated life experiences and learned to cope with them, are often the happiest.

But as many observers have pointed out, the stresses faced by kids today are different from those faced by their parents’ or grandparents’ generations. The world is a more complex place than it once was, the economy is more difficult and the future more uncertain – and it’s all being intensified by the pervasive presence of social media.

When kids are stressed and not managing it well, does it put them on the path to becoming tomorrow’s stressed adults, with all the unhealthy and potentially destructive behaviors this entails? While adults can’t always make the world an easier place for kids, acknowledging that there’s far more pressure on kids than there used to be seems like the first step toward taking this issue seriously and helping them develop coping mechanisms that can carry them into a healthier adulthood.

Too stoic for antidepressants

Should people who are depressed take antidepressant medication, or should they just tough it out?

There’s often a stigma surrounding the use of antidepressants, and it may be preventing people from getting the treatment they need, college student Leah Lancaster wrote this week in an insightful opinion piece for the Minnesota Daily.

Lancaster writes that she has been taking antidepressants since she was 15 years old – and that without them, she most likely would not have gone to college. “Yet, when the topic comes up, I often find myself defending my decision against accusations that I’m ‘numbing myself’ or ‘taking the easy way out,’” she writes. “Supposedly, if I did yoga, ate healthier and took a more ‘natural’ approach, I wouldn’t need to contaminate my mind and body with toxic pills.”

Some of the stigma surrounding depression itself seems to have eased in the last couple of decades. But when it comes to antidepressants, it can still be hard for the public to accept that for many people, medication may be necessary to help them feel better.

It’s hard to measure how widespread this attitude might be. It clearly exists, however, and one of the consequences is untreated depression. A study that appeared last year in the Annals of Family Medicine found that patients often don’t tell their primary care doctor that they’re experiencing depression. The No. 1 reason for this lack of disclosure? They feared being prescribed an antidepressant.

Even in the medical setting, patients often are reluctant to report that they take prescription medication for depression or anxiety, writes Mag Inzire, a physician assistant at a community hospital in New York.

The patients she encounters rarely worry about disclosing a history of diabetes or high blood pressure, she wrote. “Yet when it comes to depression or anxiety, there is some uncertainty in their response. And it always seems to follow by some long, drawn-out explanation as if to justify the diagnosis.”

Depression in fact is relatively common. In any given year, about 6.5 percent of the American population will experience depression. Across a lifetime, about 16.2 percent of the population will have depression at some point. Stigma or not, antidepressants are one of the most frequently prescribed drug categories in the United States.

How antidepressants work in the brain, and whether they’re truly effective, is a matter for some debate. At one time it was thought that low levels of serotonin, a mood-enhancing chemical, were a trigger for depression, and that drugs such as Prozac, which raise the level of serotonin in the brain, would correct this. This theory has been called into question, though, and if continuing neuroscience study is any indication, the role of antidepressants is considerably more complex than this.

Why, for instance, does medication seem to be more effective for severe depression but less so for mild or moderate depression? Why do some antidepressant medications cause a worsening of depression in some people?

Studies have found that people with mild depression often do well with talk therapy alone. Other studies have found that a combination of medication and talk therapy is often most effective for mild to moderate depression. What does this mean for the role of talk therapy in treating some forms of depression?

Of the millions of antidepressant pills dispensed in the U.S. each year, some likely have been overprescribed to those who don’t really need them. “The reality is that many psychiatrists do give out pills too freely, and many patients start taking medications without properly researching them beforehand,” Lancaster writes.

But in her own case, antidepressant medication has made the difference between being able to function and withdrawing from life, she wrote.

Medication hasn’t been a cure for her. “No pills can do that,” she wrote. “What they can do is give you some energy and focus so you can make it through the day without feeling lethargic, irritable or just downright horrible.”

And she notes a double standard, at least in college-campus culture, of peers who view binge drinking, smoking, unprotected sex and “study drugs” as socially acceptable but believe antidepressants are “dangerous and mind numbing.”

“Like any medicine, antidepressants aren’t perfect,” she wrote. “But to make the sweeping generalization that all of them are bad is dangerous and prevents many from getting the help they need.”

Grief for the holidays

It’s hard to look at the calendar and not be reminded that Christmas Eve will mark exactly four months since my dad’s funeral. There’s going to be an enormous gap in the family holiday celebration this year, and in fact every year from now on.

But for what it’s worth, we are far from alone in having grief as an uninvited guest for the holidays.

Although the cultural expectation is that this is supposed to be a joyful time of the year, the reality is otherwise for anyone dealing with death, illness, financial difficulties, divorce, homelessness, or other forms of loss.

We shouldn’t need to be reminded of this, but somehow we often do anyway. And it seems many of us need outside advice on how to cope – or, for those who aren’t anticipating that their own holidays might be difficult, advice on how to be sensitive toward family and friends who are.

My email inbox has been filling up since October with suggestions on everything from getting through the holidays while undergoing cancer treatment to coping after a natural disaster. A half-hour on the Internet turned up even more advice and insight, much of it from experts on grief.

If there’s one message to be gleaned from all this information, it would perhaps be this: Expect your emotions to be near the surface and expect that it will be hard at times, but concentrate on how you can make the holidays both manageable and meaningful in spite of what you’re dealing with.

Caroline Flohr, who lives in suburban Seattle and recently published “Heaven’s Child,” a memoir about the sudden death of her 16-year-old daughter, Sarah, has this to say: “Through the web of pain, I have been amazed by the power of family, love and faith in healing.”

Have faith in your own inner strength and be appreciative of what you have, she writes.

From a grief counselor: Try to avoid comparing your situation with that of other people who are together and enjoying the holidays; no family gathering is perfect or stress-free.

Alan Wolfelt, the founder of the Center for Loss and Life Transition in Fort Collins, Colo., and a noted author and counselor, suggests that rather than allowing well-meaning friends and family to prescribe how they think you should spend the holiday, focus instead on what would be meaningful to you.

What about the thousands of people for whom health challenges will be an unavoidable part of the holidays? Deborah Cornwall, a leadership volunteer for the American Cancer Society and author of a new book, “Things I Wish I’d Known: Cancer Caregivers Speak Out,” sums it up this way: “Keep it festive. Keep it simple. Keep it social. Keep it positive.”

Having cancer or being a caregiver for someone with cancer (or any other major or chronic disease, for that matter) is often overwhelming, so look for normalcy, she advises. This might mean focusing on a few traditional activities, such as baking and decorating cookies, that are most important to you and skipping the rest. Make togetherness the priority – and find time to laugh, Cornwall suggests.

Those who haven’t yet experienced grief or illness or hardship during the holidays may want to be helpful but don’t know what to say or do.

Again, the experts come to the rescue with some important tips: Don’t judge. Don’t give advice that hasn’t been asked for. Be present and listen. Rather than waiting to be asked or making vague offers of help, take the initiative and offer to help in ways that are specific and practical, such as bringing over dinner or shoveling snow off the sidewalk.

In the days and weeks after Dad died, it was often the little things that mattered most – the cards, the phone calls, the neighbors who brought food, the people who took the time to share their memories of him.

Studies on coping with grief and adversity mostly point to the same conclusion: Support from other people matters, and an essential part of the recovery process is the construction of meaning out of loss. Even though the holidays are often a serious test of people’s emotional fortitude, they can also be an opportunity for the sick, the struggling and the bereaved to become more resilient.

Tragedy, black humor and coping

A seemingly frivolous tweet about superstorm Sandy drew a sharp rebuke this week from the Michigan Nurses Association.

The offending tweet came from the Detroit News and actually was a retweet of what another Twitter user (a civilian, apparently) said about the storm: “BREAKING: Frankenstorm upgraded to Count Stormula.”

The nurses’ association tweeted back a pointed rebuke.

The organization’s communications director, Dawn Kettinger, then contacted media uberblogger Jim Romenesko to further press home the point. “People are hungry for information and connection right now,” she wrote. “Moments of misjudgment are understandable, but perhaps it is worth another discussion of how media use their resources and power.”

Is it ever OK to use dark humor when confronted with tragedy, whether it’s death, life-threatening illness or injury, or natural disaster?

It’s a little ironic that the “Count Stormula” criticism would come from someone in the health professions, a demographic that’s well known to engage in black humor to cope with the illness, injury, tragedy and stress they encounter on a daily basis.

There’s been little formal research on the use or frequency of black humor among health care professionals, but it seems to be most common in the higher-stress specialties: emergency medicine, critical care and surgery.

A small-scale study several years ago, involving 608 paramedics in New Hampshire and Vermont, as well as members of the National Flight Paramedics Association, found that almost 90 percent used gallows humor as a coping mechanism. In fact, it was their most frequent method of de-stressing, even more than venting with colleagues or spending time with family and friends.

The use of irreverent terms such as “acute lead poisoning” to refer to a gunshot wound or “celestial transfer” to describe patients who have died is so widespread that several online dictionaries have been compiled to list all of them.

Some studies even have found that black humor can benefit clinicians and patients by reducing tension and allowing the clinician to focus on the situation at hand.

But there seems to be a growing consensus that it’s not acceptable to resort to black humor within earshot of patients and families, or to frame the joke at someone else’s expense (and I daresay this applies to the media as well).

A group of Canadian researchers who explored the issue agreed that although laughter is often therapeutic, cynical humor that’s directed toward patients “can be seen as unprofessional, disrespectful and dehumanizing.”

Furthermore, the use of black humor to cope is frequently a learned behavior that medical educators need to address, they wrote:

“There is disturbing but compelling evidence that medical education and acculturation are partly to blame, by tolerating and even fostering a certain detachment and cynicism. Recent moves to encourage the development and evaluation of professionalism in medicine embrace concerns about this issue and the distinction between dark humor about the human condition and the particular observations of those who style themselves as healers.”

At its worst, black humor can stereotype or dehumanize the patient and make it harder to be objective or empathetic, wrote a nurse at Those Emergency Blues, a group blog of Toronto ER nurses. When this happens, doctors and nurses can end up providing poor care, she wrote. “The wisdom is having the insight to understand the sources of black humour in our own relative helplessness, and to recognize it, first, as an inevitable part of our practice, and secondly, as having a time and place.”

Readers, what do you think? When is black humor appropriate and when is it unacceptable?

The value of doing nothing

For one glorious day this summer, I did nothing.

Well, not literally nothing. The cat was fed, a couple of reasonably nutritious meals were prepared and the dishes were stacked in the dishwasher.

But I didn’t do anything that could remotely be defined as constructive. And it felt so, so good. So good, in fact, that I want to do it again.

Most of us, truth be told, don’t have days like this very often. Why not? Guilt, I suppose. Busyness. A sense of responsibility. Sheer necessity.

Like just about everyone else I know, I have all these elements in my life, and in spades. But for one day I managed to (mostly) suppress the feeling of I-know-I-really-should-be-doing-something-worthwhile and concentrate instead on the research: namely, that science tells us it’s good to switch off the circuits every so often and let our brains lapse into the calm, quiet space of nothing.

There’s biology behind this. The human brain is an incredibly complex organ, containing billions of nerve cells that process millions of information bits per second. A certain amount of stimulation is good. It’s well known, for example, that infants need physical, mental and emotional stimulation early in life for optimal brain development.

But if inadequate stimulation is bad, so is too much. And let’s face it, the American environment is overripe these days with stimulation, from background noise to occupational workloads to the perpetual presence of laptops and Twitter.

When the American Psychological Association conducted a survey of 1,200 adults last year, 39 percent said they were more stressed than the year before. Work-related stress was the major culprit, with relationships and health problems not far behind.

More tellingly, however, many people seemed to perceive this as normal. “Previously it was seen as something to be fought against,” Marie Dacey, of the Massachusetts College of Pharmacy and Health Sciences in Boston, told USA Today earlier this year. “People are not identifying stress as stress but as part of the norm. Our culture really applauds and rewards that.”

No wonder we often feel guilty for slacking off; it makes us look, well, like slackers.

There are bigger things to worry about than the guilt factor, though. Researchers have found that chronic stress, even the everyday kind, can have a negative effect on portions of the brain that govern impulse control, emotions and physiological functions such as insulin and glucose regulation. Over time, the cumulative impact can literally shrink the brain and leave it more vulnerable when a major stressor or adverse life event occurs.

Doing nothing allows us to step back from the constant stimulation and just let the brain relax. Thoughts, memories, impressions swim up aimlessly without the pressure to do something about them. The racing mind finally has a chance to slow down, and with this slowing down comes more clarity, more attentiveness to the here and now.

As a writer, I’m especially intrigued by some of the research suggesting that creativity and innovation are born out of those quiet times when the brain is meandering and then — eureka! — there’s a flash of insight.

Some fascinating long-term research by the Harvard Business School has found that people are usually most creative in the workplace when they have fewer distractions and interruptions, have deadlines but not intense time pressures, and can allow their ideas to “incubate” while working on other tasks.

Doing nothing may look idle on the surface, but underneath that tranquil surface a whole lot of beneficial things are happening — and that’s not nothing.

Will the mental health stigma ever end?

My calendar today says it’s Aug. 8, 2012. That’s more than a decade into the 21st century. So why does it still often feel as if we’re back in the Dark Ages when it comes to mental health?

The latest evidence that the stigma surrounding depression and substance use remains alive and well was demonstrated this week in the petty politicking of Sen. Mike Parry of Waseca, Minn., who portrayed Minnesota Gov. Mark Dayton as a pill popper and “scary.”

You can say this for the critics: They were immediately all over Parry like glue on a cheap campaign bumper sticker. Maybe there’s more enlightenment out there than I think.

But would that this were about politics alone.

I don’t know Sen. Parry, I wasn’t at the Brown County Republican fundraiser where the remarks were made, and I wouldn’t presume to guess what he was thinking. But given that Gov. Dayton has been open about his depression and alcoholism, both of which have been treated, it’s hard to see this as anything other than a barb aimed squarely at the “weakness” implied by taking medication for a mental health issue.

And it’s shameful, because the last thing we need is to perpetuate the negative attitudes and judgments that often make it so hard for people to get help.

I had the opportunity last week to visit with some folks at Chippewa County Montevideo Hospital and Medical Clinics, here in west central Minnesota, about a community survey they’re launching this month to learn more about local attitudes regarding mental health and substance use.

The health providers in Montevideo have been working hard in the past few years to get better at identifying people who may have issues with depression and unhealthy substance use and connecting them with sources of help. What they’ve discovered, however, is that it’s often difficult to even begin the conversation. Patients are uncomfortable talking about it; sometimes doctors are uncomfortable too.

It’s hoped that the survey will give Montevideo health providers a better understanding of how their community feels about depression and substance abuse and lead to some strategies that will help.

People don’t get better when these issues stay in the closet. But to admit to being depressed, or to having a problem with alcohol, is to risk opening oneself up to negative judgments by the misinformed. The fact that a public figure would even go there – and on the record, no less – makes it painfully clear that the fear of being stigmatized is well founded.

Consider a poll conducted in 2004 in Tarrant County and Fort Worth, Texas, that found two out of five of those surveyed believed anyone with a history of mental illness should be barred from public office, more than 40 percent believed major depression was the result of a lack of willpower, and fully 60 percent thought “pulling yourself together” was an effective treatment for depression. Or a 2009 study in the Medical Care journal that concluded fear of stigma and how their parents would react was a major reason why adolescents didn’t seek treatment for depression. Or recent research suggesting that one in three U.S. soldiers with post-traumatic stress disorder doesn’t seek treatment because of the stigma.

Really, the Parry incident shouldn’t be worth the ink being spent on it, except for one thing: the message that is sent by looking the other way.

Untreated depression and substance abuse take a toll on people’s lives, on their quality of life and their human potential. They take a toll on society. The consequences are serious indeed: Of the thousands of Americans who die by suicide each year, the vast majority have an untreated or ineffectively treated mental illness, most commonly depression.

Can we afford to allow the misinformation and stigma to continue? I don’t think so. Please get the facts and get educated. If you can’t or won’t, then at least have the grace to just keep quiet.

The traumatic side of illness

When I read last week about a new study that found heart attack survivors can be at risk for post-traumatic stress disorder, I confess to a rather cynical reaction: “You mean it took this long to recognize that heart attacks are traumatizing for some people?”

The research, which was published this month in the PLoS One journal, analyzed two dozen smaller-scale studies and concluded that about one in eight survivors of a heart attack develops signs of PTSD – and that these people can be at higher risk of a second heart attack.

We’ve become better at recognizing PTSD among the military, especially combat veterans, although it’s debatable whether we’re any better at providing help to those who need it. We’re also better at recognizing that PTSD can occur among survivors of a natural disaster, a fire, an assault or other immediately life-threatening event.

But it seems we have much farther to go when it comes to recognizing that trauma also can be part of people’s medical experiences.

There’s no denying that patients sometimes struggle with anxiety, intrusive thoughts and panic attacks in the wake of a major illness or a hospitalization. Consider the mom who wrote last year to an online parenting advice column about her toddler, who had recently been hospitalized, catheterized and poked repeatedly with needles and IVs. Two weeks after coming home from the hospital, he was clingy, fearful, frightened of unfamiliar places and people, and having trouble sleeping.

Did he have PTSD? his mother wondered. And what could she do to help him?

Post-traumatic difficulties have been reported among both heart and cancer patients, as well as among people who have been hospitalized in intensive care units. They also can occur among patients who’ve experienced a medical injury.

The medical community has been slow to take this issue seriously, however. In fact, there doesn’t even seem to be a consensus on how prevalent it is or the most reliable way to diagnose PTSD stemming from the medical setting. One recent study put the incidence of PTSD at anywhere from 4.8 percent to 29.2 percent of cardiology and oncology patients and concluded that more study is needed.

The PLoS study in fact appears to be the first systematic attempt by researchers to closely examine PTSD among cardiac survivors. The authors write:

In recent years, cardiologists and the broader medical community have become increasingly aware that psychological disorders, particularly depression, are common in patients with [acute coronary syndromes] and are associated with adverse clinical outcomes. Even so, abundant evidence suggests that psychological disorders are underrecognized and undertreated in cardiac populations. While awareness of depression has increased in cardiology practice, awareness of the possibility of PTSD due to ACS has lagged, possibly because many still see PTSD as primarily a disorder of combat veterans or sexual assault survivors.

A commenter who responded to a New York Times news story about the study described her own experience with severe heart disease and how she’s “never been the same since.”

“I suffer from extreme anxiety, hypervigilance and a host of other psychological symptoms related to my cardiac history,” she wrote. Her attempts to relate this to her doctors have “fallen on deaf ears.”

“Now that I have read this article I am hopeful that maybe I will get the help I so desperately need to rebuild my life and rid myself of this insidious fear and stress,” she concluded.

Why has this area of the patient experience been poorly addressed for so many years? Perhaps the medical community has had trouble coming to grips with the fact that their actions, carried out with every intention of helping the patient, can be experienced as upsetting, frightening and traumatic. Perhaps we just have a hard time thinking of PTSD in any context other than combat or violence – and indeed, there seem to be some differences between the two, not to mention a dearth of research on acute medical trauma. Does it have the same fundamental effect on the brain as combat- or violence-related trauma? Or is medical PTSD a separate entity, with different forms and different causes?

One of the most troubling aspects of medical PTSD is that it can leave people fearful and avoidant of anything to do with the medical setting – or, conversely, hyper-anxious about the need to be tested and reassured. Realistically, the avoidant ones will not be able to get through the rest of their lives without ever setting foot in a hospital or doctor’s office again, and the anxious ones are all too easy for busy clinicians to dismiss as being high-maintenance. Researchers, clinicians and the mental health community owe it to these individuals to understand what may be going on in their heads and to find ways to help reduce their suffering.

The loneliness factor: social connections and health

When it comes to health, loneliness seems to matter, suggests a new study that found a connection between feeling lonely and shorter lifespans.

The study appeared this month in the Archives of Internal Medicine and focused on a specific population: adults over the age of 60. As study participants were tracked over six years, those who self-reported feelings of loneliness were more likely to experience “functional decline” – i.e. decreased mobility, difficulty with activities of daily living and so on – and also died sooner.

The study piggybacks on another piece of research, also published this week in the Archives of Internal Medicine, that found increased mortality among adults who had heart disease or were at high risk of heart disease and who lived alone.

So does this mean loneliness is a threat to health? Some of the headlines have been blunt in concluding this – “Being lonely can kill you, studies say”; “The grim impact of loneliness and living alone,” to name a few.

But it’s not that clear-cut. Moreover, the studies don’t really explain why there would be an association between loneliness/being alone and decreased lifespan. Are these people depressed, hence not taking care of themselves? Is there no one looking out for their well-being, reminding them to take medications, advocating on their behalf at doctor visits, and so on? Does loneliness have a direct and as-yet undetermined effect on the immune system or the aging process?

There’s a difference between living alone and being lonely. In the study that looked at loneliness as a predictor of poorer health as people age, the authors draw this distinction: “Loneliness is the subjective feeling of isolation, not belonging, or lacking companionship… It is possible for persons who live alone to not feel lonely, while some who are married or living with others will still experience loneliness.”

In fact, the majority of the study participants who reported feeling lonely did not live alone. And most of those who said they were lonely were not depressed.

Given the growing number of Americans who live alone (27 percent of all U.S. households, according to the latest census), this is an area that seems ripe for study.

A number of previous studies have explored the whole issue of social connectedness and whether it has a positive influence on health. The findings have been rather intriguing: Church-going has been linked to better health, as has social support in the form of friendships and regular contact with other human beings. Matrimony seems to be beneficial too, although the benefit appears to be stronger for men than for women.

But is this because relationships and social engagement have a direct effect on how healthy we are? It could easily be argued that the people who are socially active and go to church every Sunday are simply healthier to begin with and hence more able to engage in social activities. Conversely, functional decline associated with social isolation among older adults could be a cause rather than an effect; perhaps as people decline, they’re less able to mingle with the rest of the world and are more likely to become socially isolated.

It would also be interesting to know how much of this is tied up in what seems to be an American bias in favor of extroversion and youth. Do we expect people to want and seek out companionship and look for negatives when they don’t measure up? Do we assume loneliness and aging go hand in hand?

Teasing out exactly what’s going on here will probably take much more study. Feeling lonely vs. living alone might be two separate issues with dynamics that need to be considered in different ways. How we perceive purpose and connectedness may matter too; someone who seems to be alone might not feel lonely and might in fact be leading a rich existence that derives meaning from other sources – for instance, the arts, or pets or the outdoors.

The real message may be that psychosocial health deserves far more attention than it often receives.

Last year I wrote a story about a survey conducted by Home Instead Senior Care to identify mealtime challenges for older adults. So what was at the top of the list? The expected answer might have been the practicalities of preparing a meal, or access to fresh fruits and vegetables, or the affordability of quality food. But nope, it wasn’t any of these – what the folks who participated in this survey wanted most was companionship at mealtimes. And this was borne out when I went out and actually talked to some older adults who’d had experience both with living alone and with living in an assisted living facility where they had meals together. Virtually all of them preferred eating with someone else for company.

In the medical setting, these things tend to be assessed briefly or not at all. And indeed, primary care doctors often are so pressed to manage the medical aspects of the patient’s care that they may not have time to ask patients if they’re lonely.

But as the authors of the Archives of Internal Medicine study point out, loneliness for many older people “may be more distressing than their medical diagnoses.” Identifying loneliness and referring higher-risk individuals to the right kind of intervention might help slow or delay a decline in health, the authors suggest.

Ask patients if they’re lonely, the authors urge. “Ultimately, by asking about psychosocial concerns important to patients, our treatment focus may shift, and we will likely enhance the physician-patient relationship. By identifying loneliness we will be better able to target interventions intended to prevent functional decline and disability.”

The Hatfield-McCoy feud: What made them do it?

I was glued to the History Channel last night, watching the first two hours of the “Hatfields and McCoys” original miniseries.

It’s a grim story, based on two real clans in 1800s Appalachia who collided over economic rivalry, social change and plain old cussedness. Last night we saw the beginnings of the feud; look for the body count to pile up even more in the next two installments tonight and Thursday.

Where does that kind of rage come from? Did the fighting represent how most mountain families in the post-Civil War era settled their differences, or were the Hatfields and McCoys a departure from the norm?

Plenty of scholars have asked these same questions.

One of the more intriguing possibilities that has only emerged in recent years: The McCoy family carries a rare genetic disorder that causes adrenal tumors, leading to the release of adrenaline and other compounds that might predispose someone to anger, outbursts and perhaps violence. Left untreated, it can be fatal.

Several modern-day descendants of the McCoys have the disorder, known as von Hippel-Lindau disease, lending considerable credence to this theory. Vanderbilt Magazine published an article about it in 2007 after doctors there treated an 11-year-old girl descended from the McCoys.

Dr. Revi Mathew, associate professor of pediatrics at Vanderbilt University Medical Center, described the symptoms: “It does produce hypertension, headache and sweating intermittently depending on when the surge of these compounds occurs in the bloodstream. I suppose these compounds could possibly make somebody very angry and upset for no good reason.”

The Coal Valley News of Madison, W. Va., explored the theory further in an interview with Dr. Coleman Hatfield of Stollings, W. Va., a great-grandson of patriarch William Anderson “Devil Anse” Hatfield and the family’s historian.

Dr. Hatfield told the Coal Valley News that he had long been puzzled about the feud. “I always thought it odd how Ran’l McCoy could so easily go into a rage over seemingly inconsequential incidents. Perhaps his temper concerning ‘that damnable pig’ could better be explained partly due to a disorder or disease. His volatile mannerisms, and his inability to let go of his anger, didn’t always seem rational or reasonable.”

Genetic researchers apparently have known for years about the McCoy family’s susceptibility to von Hippel-Lindau disease. One researcher reportedly traced it through four generations of the clan. It wasn’t until 2007, however, that the information was made public, mostly to alert other McCoy relatives to their risk.

Whether genes can be blamed for a feud that lasted nearly three decades (1863-1891) and claimed 13 lives is highly debatable, of course. The Hatfields participated in the violence too, and there’s no evidence of von Hippel-Lindau disease or any similar disorder on their family tree.

What about post-traumatic stress among the clan’s two leaders, Devil Anse Hatfield and Randall McCoy, who were both Civil War veterans? Kevin Costner, who plays Anse Hatfield and also co-produced the miniseries, offers this as one of the possibilities to help explain their behavior. “I think both men suffered from post-traumatic stress syndrome,” he told the Fresno Bee last week. “Both guys came back to their families with millions of images in their heads.”

The role of moonshine in the region’s culture, its availability and the potential for alcohol abuse probably didn’t help either.

Altina Waller, a historian and professor at the University of Connecticut who spent 10 years researching the Hatfield-McCoy feud, told the Wall Street Journal that the real force underlying the conflict was likely a complicated brew of economics, social change, industrialization and politics. Americans back then may not have known about the economic changes taking place in the valleys of eastern Kentucky and West Virginia in the wake of the Civil War, but their imaginations were captured because they “saw in the feud their own anxieties about family cohesion and family violence,” Waller explained.

Then as now, the roots of violence and conflict seem to be a tangle of physical, mental and social factors that aren’t easy to pin down.

Photo: Kevin Costner as William Anderson “Devil Anse” Hatfield, courtesy of History Channel.