Downsizing all the stuff

If you’ve ever had to downsize your belongings or help a parent or aging relative with downsizing, what you’re about to read will come as no surprise.

Most of us possess an enormous quantity of stuff. Stuff we don’t use and don’t need, yet continue to hang on to for whatever reason.

Although consumer spending looms large in the American economy, there’s been little research on what we actually do when it comes to keeping and disposing of our many possessions, especially as we get past middle age.

A recent study in the Journals of Gerontology: Series B tackles this issue and concludes that although many older adults find comfort and identity in the things they’ve accumulated during their lifetime, the sheer quantity can become burdensome – not just to the owner but also to relatives who might eventually be forced to deal with all the excess belongings.

The study also turned up a finding that could make many of us think twice before procrastinating on cleaning out the attic or the closets: Past the age of 50, we become less and less likely to dispose of our possessions.

The researchers drew their findings from the 2010 Health and Retirement Study, a long-running survey of Americans 50 and older that for the first time included four questions about what people did with their belongings. What they found: 30 percent of respondents over age 70 had done nothing in the previous year to clean out, donate or dispose of excess belongings, and 80 percent had sold nothing. Even after controlling for factors such as widowhood or a recent move from a house into a smaller apartment, activities to reduce what the authors call “the material convoy” consistently declined with advancing age.

Yet more than half of the survey respondents in all age categories felt they had too much stuff. Among those in their 50s, 56 percent believed they had more possessions than they needed; for those in their 70s, it was 62 percent.

(To be clear, we’re talking here about the ordinary lifetime accumulation of possessions, not hoarding.)

One need only look at the proliferation of articles and advice about downsizing to recognize this as a growing social issue. What happens to all the belongings we no longer need or have room for as we age – does most of it end up in landfills? What about older adults who have a large accumulation of stuff they haven’t disposed of – does it become a hindrance to moving into a living situation that might be safer or more manageable? Who makes decisions about the stuff if the owner is unable to do it himself or herself?

Perhaps this is a problem unique to the current older generation, folks who came of age during the Depression and who tend to save and reuse things instead of throwing them away. Then again, the increasing square footage of the average American house (1,740 square feet for the average new single-family home in 1980 vs. 2,392 square feet in 2010, an increase of more than a third) suggests that with more space, younger families may fill it with more belongings than their parents ever had, perpetuating the problem when it’s eventually their turn to downsize.

Or perhaps declining health and reduced stamina simply make it harder to deal with excess stuff as we get older, regardless of which generation we belong to. There’s also an emotional aspect to parting with items we’ve lived with for many years.

How we accumulate and dispose of our possessions might seem like a purely personal issue, but bigger issues are at stake, the study’s authors wrote. While we may treasure our belongings, the stuff can also become burdensome, especially as we age and our vulnerability increases. “To the extent that possessions – not just single, cherished items but in their totality – create emotional and environmental drag, individuals will be less adaptive should they need to make changes in place or even change place,” the researchers wrote. “The material convoy is not innocuous, mere stuff; its disposition must be undertaken sooner or later by someone. The realization of this is why the material convoy – personal property – becomes an intergenerational or collective matter.”

Additional reading on downsizing an aging household:

20 tips to help you get rid of junk

Family things: Attending the household disbandment of older adults

Tips to help you get a grip on downsizing possessions

Learning to give bad news: the patient’s perspective

After Suzanne Leigh’s daughter, Natasha, died from a brain tumor, her mother wrote an essay about “What I wish I’d said to my child’s doctor.”

One of her pieces of advice was to ban junior doctors from sitting in while sensitive, emotional conversations took place. She wrote, “When emotions are high and we are at our most vulnerable, we don’t want to be politely scrutinized as if we were lab rats – even if it means that those in training might lose out on a one-of-a-kind lesson in patient-doctor communications.”

My last two blog posts were about training doctors to deliver bad news and the benefits (or not) of scripting to help them through a difficult conversation. Where are patients supposed to fit into this? Suzanne raises the human side of the medical teaching process – that patients and families are often drafted into being practice cases for students, many times willingly but sometimes with awkwardness, reluctance or downright resentment.

It’s not an easy balance, though, between respecting the needs of patients and families at a supremely vulnerable time and creating opportunities for students to learn and gain skills. And it’s this balancing act that touched off a contentious online debate after Suzanne’s essay was reposted late last week on KevinMD.com.

It started with a medical student who wanted to make it known that although the patient’s and family’s wishes should come first, observation is one of the ways that students learn the difficult art of giving bad news.

It is not about scrutinizing the parents during the conversation; rather it is about seeing our mentors perform an essential skill that we will very shortly have to put into practice on our own. When patients and families are comfortable allowing us in on those conversations, we are immensely grateful, as it is really the only way we can learn an immensely difficult skill.

A blanket ban on junior doctors? One commenter called it “tantamount to saying ‘I want my child to benefit from the superior care that is available in a teaching hospital, but I think an important aspect of that teaching should be abolished, since it made me personally feel uncomfortable.’”

As you might guess, the discussion started to get lively.

Knowing when to refrain from judging and graciously leave the room is a learning experience too, wrote one commenter.

Should the need for training ever outweigh a family’s discomfort with having an audience during a difficult conversation? wondered someone else. Is it fair to lay guilt on people for saying no to the presence of a doctor-in-training in the room?

Another commenter, apparently a doctor, suggested that a family who requests privacy during one of these conversations “probably has a lower level of coping skills.” (What? Maybe privacy is just their preference.)

I’m not sure there can ever be a consensus on the right way to do this, other than that it’s a tough balance between the needs of the individual and the needs of society – in this case, the value of a doctor who has learned to communicate skillfully when bad news is involved.

What’s consistently missing is the voice of patients and families themselves.

Doctors in training may learn from observing more experienced clinicians during a difficult conversation, but this is only one side of the transaction. How was it perceived by the patient and family? After all, what seems empathetic to an observer might not meet the patient’s definition of empathetic. On the other hand, there’s no one-size-fits-all approach to the art and skill of delivering bad news, so how do doctors learn to tailor what they say and how they say it without unintentionally giving offense?

Medical students might learn a lot from patients and families about giving bad news – what makes the situation worse, what makes it better. If we need a better balance between real-life training opportunities for students and allowing difficult conversations to belong to the patient and family, why not invite patients into the education process instead of debating their obligation to participate?

Emotions by rote: when empathy is scripted

Something bothered me last week while blogging about teaching medical students to deliver bad news to patients.

Although training and practice can help develop and reinforce effective, empathetic communication skills in medical students as well as doctors, I kept having the nagging thought that this wasn’t the whole story. Is delivering bad news merely about following the six steps of the SPIKES protocol? Would I want a conversation involving bad news to be packaged as a carefully learned formula? Don’t patients sense the difference between rote platitudes and genuine caring?

Then… eureka! An article in the New York Times that same day captured it exactly.

Dr. Timothy Gilligan, co-director of the Center for Excellence in Healthcare Communication, and Dr. Mikkael Sekeres, director of the leukemia program, both at the Cleveland Clinic, nailed it: Doctor-patient communication isn’t something you can readily force or script or reduce to “10 Easy Techniques for Demonstrating Empathy.” Although good communication is an essential skill for any health care practitioner, it should be real, not faked.

From their article:

No communications course will magically transform lifelong introverts to hand-holders and huggers. At the same time, we must ensure that we are not converting people who genuinely care about their patients into people who only sound as if they care. Having physicians sound like customer service representatives is not the goal.

For those doctors who are emotionally challenged, communications courses can provide the basics of relating to other human beings in ways that, at the very least, won’t be offensive. But for the rest of us, we should take care to ensure the techniques and words we learn in such courses don’t end up creating a barrier to authentic human contact that, like the white coats we wear, make it even harder to truly touch another person.

As one of those “lifelong introverts,” I take exception to the implication that introverted clinicians need fixing so they can give more hugs. Introversion and good communication are not mutually exclusive, and introverted doctors may in fact be better than many extroverts at listening to their patients.

But Drs. Gilligan and Sekeres raise important questions about the degree to which empathy can be taught and the unintended consequences of trying to program clinicians into better communicators.

Reactions to their article were almost as fascinating as the article itself. A sample:

- “Patients usually come to us because they hurt. They’re suffering, and most of their suffering isn’t from tumors or low platelet counts; it’s from their own normal emotions in response to being sick: depression, anger, confusion, loneliness, anxiety, and so on. We might be able to fix their lesions, but we can address their emotions only by listening to them, and that’s a more sublime skill than repeating, ‘Go on’ or ‘How does that make you feel?’”

- “Providing education and skill sets to improve communication will only go so far. Addressing issues around emotional intelligence is the core of the problem.”

- “As a physician, I find it amazing that anybody could write about problematic interactions between doctors and patients without noting the 2,000-pound gorilla in the room: what our healthcare system pays doctors for. The less time doctors spend with each patient, the more patients they are able to see and the more money they can make. If you pay doctors the same whether they talk to patients for five minutes or 45 minutes, guess which they are most likely to do.”

- “Most of these technically brilliant but interpersonally stunted doctors are either married, have friends, and successful careers, meaning that they must have some social skills. I’m not sure the issue has anything to do with being introverted or geeky. I think it might have a lot to do with arrogance and not seeing the patient as a person.”

I’ve heard some health care workers say scripts help them stay focused, especially in difficult situations. Others find scripts restrictive, even a bit silly, and would rather allow the conversation to flow naturally.

What do you suppose patients and families prefer – empathy that’s clumsily expressed but sincere, or all the right words with no feeling behind them?

Learning to give bad news

Bad news awaited the patient as she recovered from surgery… but the first person to enter her hospital room while she anxiously waited for the pathology report wasn’t the doctor; it was a medical student.

Could this situation have been handled better? Kyle Amber, a fourth-year student at the University of Miami Miller School of Medicine in Miami, Fla., writes about the experience in a recent issue of Academic Medicine, the journal of the Association of American Medical Colleges:

“Is it cancer?” she asked, as if completely unfazed by the three hours she waited to be seen at this busy clinic that treats patients with poor access to care. All I could reply was, “The resident will come into the room in a few minutes to discuss the findings with you.” Let down, she nodded and told me she understood. Even as I attempted to comfort my patient, I knew it would take quite some time before a resident would be able to see her in this busy teaching clinic. An hour after I had left the room, the resident walked into the room, introduced himself to the patient, and told her that the pathology demonstrated a widely disseminated Stage IV cancer.

If the patient is about to receive bad news, the first person in the room should not be a medical student, he concludes. “This is no way to deliver bad news.”

Breaking bad news to patients and families is one of the most difficult tasks for a doctor to undertake. It’s stressful, and few physicians ever look forward to it or truly become comfortable with it. Yet at some point during their training, they have to confront it.

How do you teach this skill to medical students? There’s a fair amount of debate in medical academia over whether empathy can be taught. Students can learn and practice communication skills, however, that help increase their comfort level as well as their competency. The evidence also suggests that training can help instill awareness and behaviors that make it less likely for bad news to be delivered in ways that are insensitive or devoid of compassion.

At one time, doctors received little if any formal training in how to communicate bad news. They mostly learned by watching more seasoned mentors carry out this task. Hopefully they saw good examples, but maybe they didn’t. These days, medical schools are recognizing that it’s an important skill for students to learn, and most training programs have incorporated it into their curricula.

But there seems to be an unwritten rule that medical students should never, ever be the bearer of bad news in a real-life clinical setting. That responsibility lies with more senior colleagues who are directly involved in caring for the patient.

It’s tough, then, for a medical student to be placed in the position of being with a patient who has been waiting for hours to hear the outcome of her surgery, yet be unable to do anything about that patient’s most immediate need – the need for information. Sometimes a student’s most memorable lesson is about how to handle things differently next time.

Beyond the red dress

Think “pink” in relation to health issues, and breast cancer immediately springs to mind. Think “red” and… well, many people probably will recognize it as the symbol for women’s heart health, especially if it happens to be February, but plenty of folks might come up blank.

Are we so conditioned to fear breast cancer that we can’t fully recognize or appreciate the risk that heart disease poses to women’s health? It’s an intriguing question posed by cardiologist Dr. Lisa Rosenbaum in the New England Journal of Medicine last week.

In the past decade the American Heart Association’s Go Red for Women campaign and similar educational efforts have greatly increased women’s knowledge about heart disease, Dr. Rosenbaum writes. More recently, though, the gains have mostly leveled off. Gaps in knowledge also persist among minority women, who often are at higher risk of heart disease.

Dr. Rosenbaum wonders: What’s needed here – more facts or a greater effort to address women’s emotions?

There seems to be something visceral about breast cancer that taps into women’s fears in ways that don’t happen with heart disease, Dr. Rosenbaum writes. She speculates that breast cancer may be bound up with female identity and thus resonates with women very deeply. Heart disease, she suggests, may be harder for women to engage with on the same emotional level.

Dr. Rosenbaum writes:

Have pink ribbons and Races for the Cure so permeated our culture that the resulting female solidarity lends mammography a sacred status? Is the issue that breast cancer attacks a body part that is so fundamental to female identity that, to be a woman, one must join the war on this disease? In an era when women’s reproductive rights remain under assault, is reduced screening inevitably viewed as an attempt to take something away? Or is the issue one of a tragic story we have all heard — a young woman’s life destroyed, the children who watch her suffer and are then left behind?

On the other hand, what is it about being at risk for heart disease that is emotionally dissonant for women? Might we view heart disease as the consequence of having done something bad, whereas to get breast cancer is to have something bad happen to you? In a culture obsessed with the “natural,” are risk-reducing medications anathema to our vision of healthy living? Or are we held up by our ideal of beauty? We can each summon the images of beautiful young women with breast cancer. Where are all the beautiful women with heart disease?

There’s certainly food for thought here. One way of encouraging women to rethink their perception of heart disease risk might be more emphasis on the message that heart disease isn’t always caused by unhealthy lifestyles, Dr. Rosenbaum suggests. Maybe women need to be reminded that a “natural” approach isn’t necessarily better than taking medication to reduce their cholesterol or blood pressure.

And maybe, she suggests, “we can try to move beyond disease wars toward the creation of communities of women in which stories about living with heart disease are as celebrated as stories of surviving breast cancer.”

Here’s a place to start: a collection of survivor stories from WomenHeart, a national coalition for women with heart disease. And from the excellent Heart Sisters blog by heart attack survivor Carolyn Thomas, here are several first-person stories from women who describe openly and frankly what their heart attack was like.

Competing our way to better health

Reactions to the finale of the latest season of “The Biggest Loser” seem to fall into two general categories:

1) Rachel Frederickson, who shed more than half of her body weight to win the competition and take home $250,000, looks fabulous and deserved the prize for her hard work.

2) Rachel lost too much weight too rapidly and is a prime example of everything that’s toxic about how Americans view food behavior and entertainment.

Amid all the chatter, not to mention the snark, that has swirled around the show for the past several days, there’s one point that seems worthy of extended, thoughtful public discussion: What are the ethics of creating competition around people’s health?

Treating individual health as fodder for a contest is by no means confined to “The Biggest Loser.” Corporate and community initiatives abound that pit individuals and teams against each other to see who can lose the most weight, walk the most steps, exercise the most hours and so on. The popularity of “The Biggest Loser” has spawned imitation contests in workplaces, neighborhoods and even churches.

What’s the harm, as long as it helps motivate people to change their behavior? Well, not so fast.

The competitive approach may be successful for some, and it may inspire many fence-sitters to become more engaged. For a story last year on corporate wellness trends, the Washington Post talked to a Maryland office worker who joined a corporate fitness challenge and lost 42 pounds. “There’s sort of like a peer pressure and a competitiveness to it,” Sal Alvarez told the Post, adding that he found it very motivating.

Not everyone responds well to competition, however, and some people may even find it alienating, especially if they feel coerced to participate.

Although there are plenty of positive stories about how wellness challenges have helped people change their lives, there’s surprisingly little solid evidence of competition’s overall effectiveness as a health improvement strategy. More importantly, it’s not clear whether competition leads to behavior change that’s sustained after the contest is over.

Then there’s the matter of prizes. Are people motivated by the prospect of T-shirts, water bottles and gift cards? Do they need carrots to entice them to take action or is it better to emphasize good health as the reward? What if the carrot is really, really significant, such as the quarter of a million dollars dangled before the Biggest Loser contestants?

With that amount of money at stake, along with a chance to be on national TV and impress their family and friends, it shouldn’t be surprising that the participants in the show would become uber-competitive, possibly to the point of going overboard.

In a recent interview with People magazine, Rachel herself conceded that she “maybe” was “a little too enthusiastic in my training to get to the finale.”

Maybe competition is OK as long as it helps people accomplish their health goals. Then again, maybe it exploits people’s vulnerabilities and propels them into doing something the wrong way or for the wrong reasons. Does the end justify the means? That’s the moral question that hasn’t really been answered.

The ‘shiver’ diet? Don’t we wish

We hardy inhabitants of the Snow Belt have joked for years about the calories we (theoretically) burn from shivering our way through winter.

But wait – could there be scientific evidence supporting shivering as a hot new form of winter exercise?

Apparently so, according to the New York Times, which reported today on a study suggesting that shivering boosts the metabolism in the same way that exercise does.

Study participants were brought into a lab on three occasions. During the first two sessions, they exercised on a stationary bike at an indoor temperature of 65 degrees, and samples of their blood, fat and skin cells were obtained. For the last session, the participants were instructed to lie down, lightly clad, for half an hour while the indoor temperature was lowered from 75 degrees to 53. Their skin and muscle reactions were measured and samples were taken again to see what happened.

Lo and behold, the study subjects produced the same amount of irisin, a hormone involved in the conversion of white fat cells to more desirable brown fat, from shivering as they did from exercise.

From the article:

What seemed to matter, the researchers concluded, was not the exertion of the exercise, but the contraction of various muscles, which occurred during shivering as well as cycling.

Given that the temperature on my way to work this morning was zero and most of Minnesota is under its umpteenth wind chill advisory of the season, readers will have to excuse me for not climbing enthusiastically onto the shivering-as-a-form-of-exercise bandwagon.

A variety of studies have shown that we do indeed burn more calories when we’re trying to stay warm. But whether this is an appropriate substitute for exercise is debatable, especially since the study described by the New York Times involved only 10 people – hardly enough to build a strong scientific case. And the article does in fact offer an important caveat: There seems to be no evidence that working out in the cold revs up the production of irisin any more than exercising in warmer temperatures.

In a sentence that could only have come from someone blithely unaware of the dangers of frostbite, the author concludes that if you can’t get to the gym, “at least consider lingering outside at the bus stop and shivering.”

No, thanks. I’ll take indoor exercise over lingering at the curb with a minus 25 breeze in my face any day.


Dry skin blues

There’s a bottle of fish oil capsules sitting on our kitchen counter. From the veterinary clinic. For the cat, who’s been experiencing the same misery as a lot of humans during this long winter: dry, itchy, flaky skin.

Dry skin doesn’t get much respect. After all, it’s not as serious as, say, heart disease. But as anyone who’s ever had it can attest, it’s certainly uncomfortable to live with.

Dry skin (the medical name for it is xerosis) can have many causes, some more difficult to address than others. Most of the time and for most people, though, the cause is environmental – and if it happens to be wintertime, you can usually blame it on constant exposure to dry air, both indoors and out.

The consensus seems to be that we function best with an indoor relative humidity of 40 to 60 percent. During the winter, though, indoor humidity is usually much lower – around 20 percent in most buildings (and in fact it shouldn’t be much higher than this if you want to avoid problems with condensation). Cold outdoor air holds very little moisture, so the colder it gets outside, the lower the relative humidity inside your home, workplace or school once that air is warmed – and the greater the chance that your hide will feel dry and uncomfortable.

Signs of dry skin can range from tightening, roughness and flakiness to fine lines and cracks. In more severe cases, deep painful fissures can develop that may become infected if bacteria are introduced – no small matter for health care workers and others who deal with chronically dry hands from washing them many, many times in a single day.

What to do? Here’s some advice from the Mayo Clinic:

- Choose an emollient-rich moisturizer and apply it several times a day. Products with petroleum jelly also can help, but they tend to be greasy, so you might want to limit them to nighttime use.

- Avoid harsh soaps. Cleansing creams, bath gels and soaps containing added moisture or fat are better choices.

- A long hot bath or shower may sound tempting after being outdoors in subzero weather, but beware – hot water can strip valuable moisture from your skin. Stick to warm water and limit your time in the shower. After bathing, lightly pat your skin dry, then immediately apply a moisturizer; doing so will help trap water in the surface cells of your skin.

- Use a home humidifier during the dry winter months (but be sure to follow the manufacturer’s instructions for proper cleaning and maintenance).

So what about fish oil? Although consumption of omega-3 fatty acids has some benefit for heart health, there’s little proof it does much for dry skin, at least in humans. One small study found that fish oil supplements helped reduce itching among patients undergoing hemodialysis. Another rather interesting study, on orange roughy oil, found that it lessened dry skin – but again, this was a very small study, and the fish oil was applied directly to the skin, not consumed in a capsule.

Thankfully the cat has stopped biting at herself and pulling out tufts of fur to try to relieve the itching, so we’re taking that as a sign the omega-3 supplement is helping. But we’ll let her keep the fish oil all to herself.

Talking rationally when health care goes wrong

What’s the best way to talk about it when something goes wrong with patient care?

The Minnesota Department of Health this week released its annual report on “never” events that occurred at Minnesota hospitals and surgery centers over the past year, and as usual I had an internal debate over whether to use the term “adverse event” or “medical error” or, simply, “mistake.”

You might ask, “What’s the difference?”

But if I’ve learned anything from a decade’s worth of reading and studying the yearly adverse health events report, it’s that none of this is as straightforward as it seems.

The incidents catalogued in Minnesota’s adverse health events report represent the most serious things that go wrong in hospitals – wrong-site procedures, advanced-stage pressure ulcers, patients who die by suicide while in the hospital, serious medication errors and so forth. By definition these are considered “never” events – events that should never, or only rarely, happen in the hospital.

The world of patient safety is a huge ecosystem, however, with many layers that are less easy to categorize. Does an error lie behind every bad event or is it more complicated than this? Those who work in patient safety would say to beware of oversimplification.

Not all adverse patient events are directly the result of a mistake. Not all mistakes lead to adverse events.

For that matter, not all adverse events end in harm, although plenty do and patients can be seriously disabled or even die as a result.

Sometimes it’s the individuals delivering the care who are at fault. More often, it’s the system itself that’s faulty and vulnerable and sets people up, patients and health care professionals alike, for something to go wrong.

And sometimes, in spite of everyone’s best efforts, in spite of doing all the steps correctly, things just don’t go well.

Teasing out these nuances is one of the challenges in patient safety, especially when it comes to how the public perceives and talks about patient safety. It’s still difficult for hospitals to speak openly about adverse events, partly because it’s painful to do so but also partly because of the barrage of blame and judgment that’s likely to be unleashed.

This isn’t to say providers are entitled to a free pass whenever a patient is harmed. Accountability is necessary, always. But there’s a difference between holding people accountable and being harshly punitive. When the energy is focused on blame, the attention can be deflected away from learning what went wrong, why it happened and how it can be prevented from happening again.

Because, in the end, isn’t that what everyone wants? To learn, so the vulnerabilities can be fixed and future patients are less likely to have something go wrong with their care.

One of the big lessons from 10 years of experience in Minnesota is that reporting and open discussion of adverse events are making health care measurably safer. At times the progress has been achingly slow and at times it has gone backwards, but the overall trajectory has been toward improvement.

This doesn’t happen, though, unless there’s a rational conversation about it. At some point we all need to make it less scary for people to ‘fess up so the real work of learning and improving and making care safer can take place.

Read the Minnesota Department of Health 2014 report on adverse events here. A 10-year evaluation of the adverse health events reporting system can be found here.

The language of health care

A while back, I wandered into an Internet discussion about the use of the word “non-compliant” to describe patients who didn’t follow the doctor’s instructions or take their medications as directed.

Don’t use that word anymore; the correct and less judgmental term is “non-adherent,” one person wrote.

A home health nurse shot back: She’d been using the term “non-compliant” for years, there was nothing wrong with it and she wasn’t about to change, thankyouverymuch.

Health care has a language of its own. Not just the technical words, although there are plenty of them. No, this language consists of ordinary words used in ways that convey certain shades of meaning. Sometimes patients are confused by it, sometimes they’re offended, and sometimes it subtly reinforces the paternalism and power imbalance that have historically existed between health care professionals and their patients.

Were your test results “negative”? Breathe a sigh of relief, because negative results in most cases are positive news for the patient. If the doctor notes that your symptoms are “unimpressive,” it’s not meant to be denigrating, it just means you’re having symptoms that aren’t severe or pronounced. That abbreviation in your chart that says “s.o.b.” doesn’t indicate what you think it does; it just denotes that you’re having shortness of breath.

As you venture farther into this terrain, it gets trickier. Why, for instance, do patients “deny” having chest pain or whatever other symptoms they don’t have? (One of my newsroom colleagues says that whenever she hears this term, she pictures a conversation that goes something like this: “You have chest pain!” “No, I don’t!” “Yes, you do!”)

What is an “incompetent” cervix, and what kind of value judgment does this term imply?

Does it matter what words we use? Are they just a neutral collection of letters and syllables with no deeper meaning? Or is the language of health care more than this?

As patients become more engaged in their care and the doctor-patient relationship becomes more of a partnership and less of a dictatorship, the language has correspondingly come under greater scrutiny. In this brave new world, do doctors issue “orders,” or do they provide “instructions”? Do they talk about collaboration or do they make all the decisions?

The whole debate over the use of the word “compliant” illustrates the extent to which the sands have been shifting. Although the term is still frequently used, critics point to its connotations: obedient, submissive, acquiescent, yielding, docile. In short, not the words many people would like to see associated with the doctor-patient relationship. The term is increasingly being replaced by “adherent,” which sounds vaguely like it might have something to do with duct tape but at least doesn’t appear to carry some of the same emotional baggage as its predecessor.

It might seem like political correctness run amok. But words often do matter, and they can influence our thinking in ways we might not realize. For instance, it’s common to record the reason for the patient’s visit to the doctor as the “chief complaint” – a rather psychologically loaded term, since the patient might not literally be complaining and might in fact be reluctant to complain in the truest sense of the word. Does the use of this word create subtle attitudes about patients and perhaps consign them to an unwanted role?

What about the common practice of referring to the patient’s current history of ailments, issues and chronic conditions as a “problem list”? Does this invite doctors and nurses to view patients as a collection of problems in need of fixing? Does the focus on “problems” make them overlook other aspects of the patient’s health in which he or she is doing well?

What does it mean when the patient “didn’t tolerate the procedure,” and who does that imply is at fault?

I’m not sure whether these terms ought to be replaced, or what we could replace them with. There’s no vocabulary police in health care, after all, and no real consensus on what the best words should be. In any case, it’s a moving target. The debate over “compliant” vs. “adherent” isn’t going to go away any time soon, and within a few more years the favored term could drop out of favor in exchange for something better.

Occasionally, however, attention to language results in real change. At one time, patients who were in pain were said to “complain of pain.” It’s a phrase that can be freighted with emotion and value judgment, however, and health care providers began to realize that many patients with treatable pain weren’t speaking up because they didn’t want to appear to be complaining. With heightened awareness of this issue, providers began using the term “reporting pain” instead. This more neutral language is now so widespread that it has become rare, at least in the professional literature, to find references anymore to patients “complaining of pain.”

And occasionally patients themselves put the medical community on notice that some language is unacceptable. Karen Parles, executive director of the Lung Cancer Online Foundation, fired off a letter a few years ago to oncology specialists about the commonly used phrase “the patient failed chemotherapy.” She wrote:

Have the patients really “failed” when chemotherapy drugs do not work? Of course they haven’t. So why use a phrase that implies blame?

… This unfortunate convention is used in the medical literature, at professional conferences, and not surprisingly, in the clinic. It is common for oncologists to tell patients that they “failed drug X.” By telling patients they failed to respond to treatment, doctors may increase the guilt that many patients already struggle with as a result of their cancer diagnoses. For others, like me, it becomes an annoying refrain. At minimum, it puts emotional distance between doctor and patient and undermines the doctor-patient relationship. Just imagine under the same circumstances if the patient said to the doctor, “You failed to give me the right drug to treat my cancer.” The question isn’t who failed, but what failed.

Dr. Bruce Chabner, the editor-in-chief of The Oncologist, responded with an apology and a thank-you to Parles for pointing out “the all-important use and impact of our words.”

“I assure her that I have expunged ‘that phrase’ from my vernacular… and I urge my colleagues to do likewise,” he said.

This post originally appeared on Nov. 4, 2009.