Saying no to the scale

She didn’t want to be weighed in front of her entire gym class, so 13-year-old Ireland Hobart Hoch said no. But instead of respect for her decision, what she got was a visit to the principal’s office.

Her response to all the tsk-tsking, as reported in the Des Moines (Iowa) Register: “I really wasn’t comfortable with anybody but my mom and doctor knowing my weight. For another person to know – that’s not important to them.”

Teen health experts, school officials and gym teachers may beg to differ. The curriculum director at Ireland’s school suggested the real issue was with the student. “I think there are some body image issues with this girl,” she told the Register.

But my reaction is one of admiration for this young woman for taking a stand. How many adults would enjoy getting on a scale in front of their peers? Indeed, how many of us can truly say we look forward to the scale at the doctor’s office, where presumably there’s a valid health-related reason for being weighed?

Responses from the Des Moines Register’s readers were mostly in support of Ireland. Many commenters dredged up tales of their own youthful humiliation and body anxiety. The consensus seemed to be that a) there’s little benefit to weighing students during gym class; and b) the school district overstepped its authority.

Which prompts a larger question: How critical is it to submit to the scale, anyway?

At times it’s critical indeed. Anesthesia, for example, is dosed according to weight. The same goes for many chemotherapy regimens for treating cancer. Someone with congestive heart failure needs to be weighed often to monitor for potentially serious accumulation of fluid. A history of an eating disorder, recovery from weight-loss surgery, monitoring for medication side effects – any of these can be a valid medical reason for asking the patient to get on the scale.

A good case can be made that even for normal, relatively healthy people, charting the ups and downs of their weight over time is a valuable piece of their personal health information – and one that can provide an early warning sign of looming problems.

If only so many of us didn’t dislike it so much.

A few years ago, the New York Times reported on a University of Pennsylvania study that found many women were so uncomfortable with the prospect of facing the scale, they stayed away from the doctor’s office altogether.

Commenters had plenty of their own experiences to share. One person stepped onto the scale only to have the nurse blurt out, “I see you’ve been eating too much.” Another woman noted that although the number on the scale is just a number, our increasingly size-conscious society has attached so much moral meaning to the body mass index that weight has become a form of judgment. “I try not to feel ‘judged’ but it’s so hard not to, with every other health-related news story out there being ‘OMG! You are 5 pounds overweight! You are going to die soon, if not today, you gross gross fatty!’” she wrote.

Lest readers think this is an issue related only to obesity, anxiety over the scale can also be extremely triggering for many people who have an eating disorder or are recovering from one.

Apparently Ireland Hobart Hoch wasn’t the only student at her school who was bothered by the gym class weigh-in. She told the Des Moines Register that classmates confided to her that they didn’t want to be weighed either. “I don’t want them to feel uncomfortable when they get weighed. I just feel really strongly about it,” she said.

To their credit, officials at her school suspended the weight screenings after she complained. But it makes one wonder why it had to come to this. If being weighed creates so much angst among so many people, what does this say about American cultural attitudes towards weight? Why not find ways to make the process less intimidating – relocating the scale from a public hallway at the clinic, say, or training staff to avoid nagging and/or thoughtless comments?

Maybe it’s time to step back and take a deep collective breath, suggests registered dietitian Bonnie Taub-Dix. Getting weighed “should be a confirmation, not a reprimand,” she said.

And people always have the right to say no.

Aging parents, adult children: the awareness gap

Adult children visit their aging parents for the holidays and discover a refrigerator full of spoiled food, stacks of unopened mail and a parent who seems alarmingly frail or confused. A closer look at the finances reveals unpaid bills or, worse yet, a bank account depleted by scammers.

In online forums for caregivers dealing with dementia, examples of this are abundant. One person describes a house overflowing with clutter and aged parents who were going up and down an unlighted basement stairway to do the laundry. Someone else relates her shock at seeing her father’s house fall into disrepair as he became more disabled.

Can adult children be so disconnected or so deep in denial that they don’t see what’s happening to their own parents?

Sometimes that’s the case. But let’s not be too quick to judge, because there can be a considerable gap between how older adults present themselves to the world and how they truly fare on a day-to-day basis.

“As the child who lived close by when my parents were still in their home and having siblings who lived far away, there may be a lot more going on in their home than you may be aware of,” wrote a member of an online Alzheimer’s forum.

This family felt their father’s cognitive skills were fine. But “there was a lot going on behind the scenes with them that they hid well,” she wrote.

In fact, it’s far from unusual for older adults to hide their struggles with health, cognition and daily life from those around them. And it’s far from unusual for adult children or other relatives to have difficulty gauging the severity of the situation or knowing when to take action.

Carolyn Rosenblatt, a contributor to Forbes who writes about healthy aging and caregiving, describes a typical scenario:

The signs are subtle at first. The brain-destroying disease that creeps up unannounced and steals your loved one comes in disguise. “Maybe he’s just getting old,” you tell yourself.

Your aging parent may have noticed being unable to remember things for some time. Dad will compensate by changing the subject, or finding some other words to replace the ones he can’t find. But he might just stop in the middle of a sentence. He works at covering up the problem.

Mom will insist she’s fine. She knows she isn’t but doesn’t want you to find out. She’ll do anything to keep her memory loss a secret. She fears you’ll put her in a home. To her, that’s a death sentence.

When the adult children live far away and can’t – or don’t – visit their parents often, which is increasingly the case for many families, it becomes even more challenging for them to accurately assess the situation.

The issue can be compounded by reluctance or refusal to talk about what’s really going on, leading to significant gaps in knowledge and, by extension, readiness for caregiving. When The Boomer Project conducted a survey for Home Instead Senior Care a few years ago, only about one in 10 of the adult children who responded were realistic about the likelihood of being thrust unexpectedly into caring for their parents.

Nor were they well informed about their parents’ health. Only about half were knowledgeable about their parents’ medical conditions or could name any of the medications their parents took. Forty percent didn’t know the name of their parents’ primary care doctor or whether their parents even had a primary care doctor.

At this time of year, providers of services to older adults start receiving a barrage of phone calls from worried adult children who come home for the holidays and discover their parents aren’t doing as well as they thought.

Older adults and their grown children may think that as long as everything seems OK, there’s no need to plan for the future. But this mindset can lead to a family crisis, experts say.

Waiting for “the right time” to have an honest, respectful discussion with aging parents about their needs won’t cut it, because it can be too late sooner than you think, Rosenblatt writes. “You don’t want to be the one lulled into a false sense of security because no one has diagnosed your aging parent with a specific form of dementia. It doesn’t matter. Trust your own eyes and ears. If your gut tells you there’s something wrong here with your loved one, there probably is something wrong.”

Further resources:

Tips for starting sensitive conversations

Recognizing the difference between ordinary forgetfulness and signs of dementia

Lying to the doctor

One of the cardinal rules for patients is “Thou shalt not lie to the doctor.”

But people do it anyway, and deception seems to be more prevalent than most of us might think. Software Advice, a practice management systems consultancy, recently surveyed 3,075 patients and learned that half of them have deliberately withheld information from a doctor or other health care professional at least some of the time.

The percentage is probably even higher if we include unintentional withholding of information – the details that patients omit because they think they’re unrelated or not worth mentioning.

The survey respondents were most likely to be untruthful about drug, alcohol and tobacco use. Other areas prone to deception included eating habits, sexual activity, current medications and use of complementary and alternative therapies.

That wasn’t all, though. Almost 40 percent of those who took the survey admitted to minimizing their symptoms, their current or past medical history, or their health-related behavior. About one-fourth deceived by exaggerating, and another one in four said there were times when they didn’t disclose the information at all.

Why are patients sometimes less than forthcoming? The top reason, according to the survey, was fear of being lectured or feeling embarrassed. Patients also feared a loss of privacy if they disclosed something sensitive. And some of the time they simply wanted to avoid the cost or inconvenience of treatment.

The survey participants said they were more likely to be prodded into honesty if they had assurance that the information they disclosed would be kept confidential. Knowing that the doctor or nurse wouldn’t judge them was also important.

It also seemed to help when patients had more information about the consequences of dishonesty – for example, the possibility of a misdiagnosis or harmful drug interaction. In addition, patients wanted assurance that they would be helped, and a small minority felt they would be more truthful if the doctor gently confronted them over a perceived lie.

There’s plenty of research to confirm that patients aren’t always forthcoming. For instance, a poll conducted a couple of years ago by Legacy, a national public health organization, found that one in 10 people didn’t disclose their tobacco use – and that the more stigmatized they felt, the less likely they were to tell the doctor or nurse that they smoked. Another study found that patients often don’t self-report symptoms of depression.

Most doctors are familiar with patients’ tendency to sometimes sand the rough edges off the truth. A popular rule of thumb that many doctors learn during their training is to automatically double the amount of alcohol that patients say they consume; someone who reports having two drinks a day is really having four.

Not all patients lie, however, and a handful of studies suggest that doctors may overestimate how often this happens. For that matter, doctors aren’t always truthful with patients either – but that’s a topic for another day.

Dishonesty in the doctor-patient relationship, even if the patient thinks it’s just a little white lie, rarely serves the patient’s best interests. At best, a health problem that’s minimized or not reported at all won’t get better. In the worst-case scenario, patients can end up being harmed because of inaccurate or incomplete information.

The easy answer would be to tell patients to get over their fears and embarrassment and just tell the truth. But count on it: for many people, this isn’t enough. The survey by Software Advice concluded that if doctors want their patients to be honest, they need to make an effort of their own to create a trusting, non-threatening environment that encourages patients to open up: “Asking informed follow-up questions, maintaining eye contact, avoiding lecturing and clearly explaining confidentiality laws can go a long way towards improving communication and trust with patients.”

But… what did the patient want?

It’s truly a dilemma for the doctor. The patient’s test results are back and the news isn’t good. But it’s Friday afternoon and there’s a decision to make: Call the patient now or wait until Monday?

Dr. Lucy Hornstein, a family practice physician who blogs at Musings of a Dinosaur, wrestled recently with this question: “Here’s how I looked at it: either the patient got to spend the weekend not knowing, or knowing the worst but not being able to do anything about it for three long days. I elected not to ruin the weekend.”

If she had been on the receiving end of bad news, her own preference would have been to wait until Monday, she reasons. But she still wonders if she made the right decision for her patient, and asks her readers, “What would you have done?”

Some said it was the right call. Others weren’t so sure, however. One of the commenters summed it up this way: “At first I thought I’d do what you did, but then I thought that the decision probably needs to be based on who the patient is and how that person deals with things.”

Bingo!

This is exactly the kind of situation in which patient-centered care often stumbles. We know what we would want, so we conclude – no, wait, we assume – that it’s probably what the patient wants too.

Applying our own filter to other people’s likes, dislikes, preferences and values is common to just about everyone. Most of the time we do it without ill intentions; it’s purely automatic.

I’d submit, however, that it’s an issue when it comes to something as personal as people’s health decisions.

At the core of shared decision-making is putting the patient’s values first. This sounds straightforward. In the real world, though, there’s often a huge leap from describing what shared decision-making should look like to actually making it happen.

I like the definition that appeared a while back in Virtual Mentor, the ethics journal of the American Medical Association: “Shared decision making is an active dialogue between physician and patient with the goal of arriving at mutual understanding and agreement on a treatment plan.” (emphasis added)

It occurs to me that mutual understanding starts by getting a handle on how patients make medical decisions and what their preferences are for receiving information.

Do they want the unvarnished truth or do they want the discussion to be softer? How do they respond to scare tactics – are they motivated or are they turned off? Do they want to know everything or just the highlights? Do they want a high level of participation in their care or are they most comfortable with allowing the doctor to take the lead?

Doctors obviously can’t know any of this unless they ask. And since it seems unfair to expect them to explore (or remember) the patient’s preferences every time a decision has to be made, here’s a semi-radical proposal: Create a section within the medical record for the patient’s information and decision-making preferences – a values history, if you will, that documents what is important or helpful for the doctor to know, in the same way that social, occupational and family medical history are documented.

Would this need to be a standardized, evidence-based tool? Not necessarily. Individual doctors or individual practices could develop something that works for them. Nor would it have to be lengthy or onerous, since getting to know the patient and his or her values is something that unfolds over time through a series of encounters and varying contexts, rather than as a single, static point of data collection.
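To make the idea a bit more concrete, here is a minimal sketch of the kind of information such a values history might hold, written as a small Python structure. Every field name and category in it is a hypothetical illustration of my own, not an existing standard, template or instrument; a real version would use whatever wording and categories an individual practice finds useful.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ValuesHistory:
    """Hypothetical sketch of a 'values history' section in a patient record.

    Every field name and category here is an illustrative assumption,
    not a standardized instrument; a practice would pick its own wording.
    """
    detail_preference: str = "full"             # "full" (unvarnished) vs. "highlights only"
    bad_news_timing: str = "as soon as known"   # e.g. call right away vs. wait for a visit
    risk_framing: str = "plain facts"           # motivated or put off by scare tactics?
    participation: str = "shared"               # "doctor-led", "shared" or "patient-led"
    notes: Optional[str] = None                 # free text, built up over several encounters

# An entry a clinician might record after a conversation with the patient:
example = ValuesHistory(
    bad_news_timing="as soon as known",
    notes="Prefers a phone call even on a Friday; wants time to plan.",
)
print(example)
```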

It might be a good exercise for patients too, and could even help set the stage for better conversations about end-of-life care.

I can’t speak for Dr. Hornstein’s patient but I know what I would have wanted. I would have wanted to be told sooner rather than later, so I could have time to start absorbing the bad news and planning for what comes next.

I’m pretty sure I wouldn’t have faulted the doctor for waiting; she was motivated by the best of intentions. But think how much better this might have been if the patient’s preferences were known. The doctor wouldn’t have had to wrestle with the decision or second-guess herself, and the patient’s priorities would have been respected – no assumptions, no “this is what I’d want if it were me.” It would have been what the patient really did want, and it would have been shared decision-making.

Dinner at home: ideal vs. reality

According to the common nutritional wisdom, families who sit down together for home-cooked meals tend to be both healthier and happier, and research for the most part bears this out.

But when a group of sociologists decided to study what it really takes to prepare a family dinner, they learned that all is not well in the kitchen. In fact, moms reported feeling pressured to live up to unrealistic ideals and many felt the benefits of home-prepared food weren’t worth the hassle.

Utopia, meet Real Life.

Food gurus may romanticize the love and skill that go into preparing a meal and the appreciation with which it’s eaten, but “they fail to see all of the invisible labor that goes into planning, making and coordinating family meals,” the researchers concluded. “Cooking is at times joyful, but it is also filled with time pressures, tradeoffs designed to save money, and the burden of pleasing others.”

For their study, aptly titled “The Joy of Cooking?”, they spent a year and a half conducting in-depth interviews with a social and economic cross-section of 150 black, white and Latina mothers. They also spent more than 250 hours observing poor and working-class families as they shopped for groceries, cooked and ate meals.

They found mothers were strapped for time, sometimes working two jobs and unpredictable hours to make ends meet, with little energy left over to plan a meal, prepare it and clean up the kitchen afterwards while their children clamored for attention. “If it was up to me, I wouldn’t cook,” one mother bluntly told the researchers.

They discovered that in most of the poorer households they observed, mothers routinely did their own cooking to save money. But these women often were disadvantaged by kitchens that were too small and inadequately equipped – not enough counter space, a shortage of pots and pans, lack of sharp knives and cutting boards, and so on. One family living in a motel room prepared all their meals in a microwave and washed their dishes in the bathroom sink.

A common barrier was the cost of fresh fruit, vegetables, whole grains and lean meats, typical ingredients of a healthy meal. Some of the mothers didn’t have reliable transportation so they only shopped for groceries once a month, which limited the amount of fresh produce they could buy. Even in the middle-class households, moms fretted that the cost of quality food forced them to buy more processed foods and less organic food than they wished.

The final straw: family members who fussed, picked at the food or refused to eat what was served. The researchers rarely observed a family meal during which no one complained. In many of the low-income homes, moms resorted to making the same foods over and over rather than try something new that might be rejected and go to waste. For middle-class moms, the pressure to put healthy, balanced meals on the table often led to considerable anxiety over what they cooked and served.

Despite all this, is it possible for families to consistently prepare good meals at home and actually enjoy the experience instead of viewing it as a chore? Of course it is. But for many households, getting there clearly will be a slog.

When the reality surrounding the home-cooked meal is often at odds with the ideal, why then do food system reformers insist that the revolution needs to happen in the household kitchen? the researchers wonder.

They call the emerging standard for the home-cooked meal “a tasty illusion, one that is moralistic, and rather elitist, instead of a realistic vision of cooking today. Intentionally or not, it places the burden of a healthy, home-cooked meal on women.”

Perhaps we need more options for feeding families, such as introducing healthy food trucks or monthly town suppers, or getting schools and workplaces involved in sharing healthy lunches, they suggest. “Without creative solutions like these, suggesting that we return to the kitchen en masse will do little more than increase the burden so many women already bear.”

DIY blood pressure management

What would happen if patients with high blood pressure were allowed to manage the condition with minimal involvement by a doctor?

A recent study gives support to the notion that when patients are given the opportunity to take charge of their care, many of them will successfully rise to the occasion.

OK, that’s reaching beyond what the study actually addressed. (Here’s the link to the Journal of the American Medical Association, where the research was published a couple of weeks ago.) But it offers an intriguing look at allowing patients a more active role in managing a chronic condition.

Short summary of the study: Researchers compared two groups of patients, one that received standard care for high blood pressure and one that monitored their blood pressure at home and self-adjusted their medication as necessary. At the end of one year, the self-monitoring patients achieved an average blood pressure reading of 128 over 74, while the group that received the usual care – visiting the doctor’s office at regular intervals to have their blood pressure checked and their medications reviewed and adjusted – had an average of 138 over 76. (High blood pressure is defined as consistent readings of 140/90 or higher.)

The researchers primarily wanted to find out whether the do-it-yourself approach is effective in managing hypertension, a significant risk factor for stroke, cardiovascular disease and kidney disease. In the U.S., about one in three adults has high blood pressure and only about half have it under control.

Based on the study’s results, self-management seems to be feasible, at least for some patients, in reaching the clinical goal of a healthy blood pressure.

There are a ton of questions that still need to be answered. Which patients might be the best candidates for a greater role in managing their blood pressure? What kind of education or support do they need for taking on this responsibility? Can patients continue to self-manage their blood pressure over a period of years or will adherence drop off after a year or two?

What really caught my attention, though, was the evidence that patients don’t invariably fail when they’re given more responsibility. Moreover, it’s possible for them to achieve good results. (It should be noted that the self-managing patients in the JAMA study weren’t entirely on their own; any adjustments they made in their blood pressure medication were part of an agreed-on plan with their doctor.)
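For illustration only, here is a minimal sketch of what a pre-agreed self-adjustment check might look like in code. The target numbers, the averaging window and the wording of the suggested action are my own assumptions, not the protocol used in the JAMA study, and any real plan would be worked out between a patient and their own doctor.

```python
# Hypothetical sketch of a home blood-pressure self-monitoring check.
# Thresholds and the averaging window are illustrative assumptions,
# not the study's actual protocol; a real plan is agreed with the doctor.

def average(readings):
    return sum(readings) / len(readings)

def check_self_titration(systolic_readings, diastolic_readings,
                         target=(135, 85), window=7):
    """Return a suggested action based on the last `window` home readings."""
    recent_sys = systolic_readings[-window:]
    recent_dia = diastolic_readings[-window:]
    if average(recent_sys) > target[0] or average(recent_dia) > target[1]:
        return "Above target: follow the pre-agreed medication step and tell the doctor."
    return "At or below target: continue the current plan."

# Example: a week of morning readings
sys = [142, 138, 140, 144, 139, 141, 143]
dia = [88, 84, 86, 90, 85, 87, 89]
print(check_self_titration(sys, dia))
```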

Helping patients reach optimal blood pressure is one of the most common tasks carried out by primary care doctors. As you might guess, it consumes a great deal of time and resources. If patients who are capable of self-managing their blood pressure are given more latitude to do so, maybe it would free up the doctor’s time to concentrate on the more challenging cases and everyone would gain something.

Rich, poor, healthy, sick

Take two people, one with a college degree and earning $100,000 a year and the other with a high school diploma earning $15,000, and guess which one is likely to have better health.

Those who study population health have long known that, elitist as it sounds, income and education are two of the strongest predictors of overall health in the United States. Americans who are educated and financially secure tend to live longer. They’re more likely to be physically active and less likely to use tobacco. Often they have better health outcomes and better management of chronic conditions such as diabetes or high blood pressure.

Exactly why this would be the case is not clearly understood. One assumption is that it’s all about access – people with more money are able to afford better food, live in safe neighborhoods and receive adequate medical care. Another assumption is that it has to do with healthy or unhealthy community environments. But an interesting review from a few years back indicates it’s much more complex than this.

The authors identify some influences that are familiar. Having more money, for example, means that people have the resources to join a health club or buy fresh fruits and vegetables. Where people live can shape how they behave – whether they have access to parks and sidewalks or the amount of tobacco and fast-food advertising they’re exposed to.

But the authors also identify several factors that are psychological and social, and subtler to tease out.

- Education and efficacy. One of the functions of education is to teach critical thinking, problem-solving and self-discipline, which can better equip people to process health information and apply it to their lives. These skills can also make them feel more confident about their ability to successfully manage their health.

- Peer group identification. People tend to associate with their own socioeconomic group and usually will adopt similar norms that reinforce their place in the social ecosystem. If everyone else in your educated, professional social circle is a nonsmoker, chances are you won’t smoke either, or will make serious attempts to quit. Likewise, blue-collar individuals may smoke to show independence, toughness and solidarity with their social group.

- Optimism about the future. Lower lifetime earnings and a sense of limited opportunity can make lower-income people feel there’s less reason to invest in their long-term health. Their health decisions can be more focused on the here and now than on the future. The authors of the review also suggest that the higher levels of stress associated with being economically disadvantaged can lead people to use tobacco, alcohol and/or eating as a way of coping.

Who remembers seeing the headlines this past month about a study that found a link between saving for retirement and being in better health? The researchers may have been onto something, namely that planning for one’s retirement could be just one of many markers for the psychosocial factors that influence health – disposable income, self-efficacy, peer group norms, belief in the future and so on.

Money does indeed make a difference but it isn’t just about money, the authors explain in their review. Walking as a form of exercise costs nothing, while smoking can be an expensive habit. What motivates someone to choose one over the other?

This is only scratching the surface, of course. Many of these factors are interrelated – for example, someone at a lower socioeconomic level who is motivated to adopt healthy habits but has difficulty achieving this because of a lack of means. And it’s hard to outsmart your genetic background regardless of your income or education level or motivation to pursue a healthy lifestyle.

There’s a huge national conversation taking place about being healthy: what it is, how to achieve it and how to reduce some of the obvious income and racial disparities. Do we just keep urging everyone to “make better choices”? Do we legislate? It’s clear from all the research that the social and psychological factors surrounding health-related behavior are complex and not easy to untangle. If ever there was an area to resist simplistic solutions, this is it.

‘Looks older than stated age’

Pity the young, pretty blonde doctor who’s constantly mistaken for being less accomplished than she truly is.

“Sexism is alive and well in medicine,” Dr. Elizabeth Horn lamented in a guest post this week at Kevin, MD, wherein she describes donning glasses and flat heels in an attempt to make people take her more seriously.

As someone who used to be mistaken for a college student well into my mid-20s, I certainly feel her pain. But let’s be fair: Doctors judge patients all the time on the basis of how old they appear to be.

It’s a longstanding practice in medicine to note in the chart whether adult patients appear to be older, younger or consistent with their stated age. Doctors defend it as a necessary piece of information that helps them discern the patient’s health status and the presence of any chronic diseases.

According to theory, patients who look older than their stated age are more likely to have poorer health, while those who look more youthful than their years are in better health. But does it have any basis in reality? Well, only slightly.

An interesting study was published a few years ago that examined this question. The researchers found that patients had to look at least 10 years older than their actual age for this to be a somewhat reliable indication of poor health. Beyond this, it didn’t have much value in helping doctors sort out their healthy patients at a glance. In fact, it turned out to have virtually no value in assessing the health of patients who looked their age.

Other studies – and there are only a few that have explored this issue – have come up with conflicting results but no clear consensus, other than the conclusion that judging someone’s apparent age is a subjective undertaking.

When there’s such limited evidence-based support for the usefulness of noting the patient’s apparent age, then why does the habit persist?

I’ve scoured the literature and can’t find a good answer. My best guess is that doctors are trained to constantly be on the lookout for risk factors – which patient is a heart attack waiting to happen, which one can’t safely be allowed to take a narcotic, which one is habitually non-adherent – and assessing apparent age vs. actual age is one more tool they think will help, a tool they may have learned during their training and continued to use without ever questioning its validity.

Appearances can be deceiving, however. A patient who looks their age or younger can still be sick. Someone who looks older can still be relatively hale and hearty.

And beware the eye-of-the-beholder effect. One of the studies that looked at this issue found that younger health care professionals consistently tended to overestimate the age of older adults. When you’re 30, everyone over the age of 60 looks like they’re 80, I guess.

Whether you’re a young physician fighting for the respect your training commands or a patient fighting against assumptions in the exam room, the message is the same: You can’t judge a book by its cover.

When you just can’t sleep

Reservations about the safety of prescription sleeping pills have been around for a long time. But new research has raised fresh concerns about when they’re appropriate and who’s most at risk.

To summarize: A study by the U.S. Centers for Disease Control and Prevention and Johns Hopkins University found that psychiatric medications – a category that includes sedatives – account for thousands of emergency room visits in the U.S. each year. One of the key findings, which may have come as somewhat of a surprise to the public, was that zolpidem, or Ambien, was implicated in 90,000 emergency room visits annually for adverse drug reactions.

The majority of ER visits for drug reactions associated with sedatives and anti-anxiety medications were among adults in their 20s, 30s and 40s. But among older adults who were taking these medications and ended up in the ER, the consequences were often more severe and were more likely to result in hospitalization.

This could be an opportunity to address adverse drug events, or emergency room utilization, or prescription drug use, or medication use by older adults. But I’m not going there, at least this time.

If I ruled the world, we would have a long-overdue national conversation about sleep and insomnia.

We’d open with a discussion of the “sleep is for wimps” mindset. Where does this come from, and who do these people think they’re kidding?

We’d take a look at the science. What do we know about the human body’s need for sleep and the mechanisms of sleep? How many questions still lack good answers?

We’d involve the medical community. How often are patients queried about their sleep? Is there more than one option for helping them, or is the immediate response to hand out (or refuse) a prescription for a hypnotic or to assume the problem is related to stress or lifestyle?

Finally, we’d get real about insomnia. Although sleep difficulties can often be traced to how people live their lives, simply telling them to practice better “sleep hygiene” may not cut it for those whose insomnia is longstanding, complex and more challenging to treat.

Somewhere in the discussion we might talk about shift work and the impact it has on sleep and health. We could talk about sleep apnea and restless legs syndrome as specific causes of poor sleep, while at the same time recognizing that many people with insomnia don’t have either of these conditions.

We could probably talk about the punishing levels of daily stress experienced by many people and how it interferes with their sleep.

And yes, we’d have a serious discussion about where pills fit into this. We would acknowledge that sleep aids are sometimes prescribed to people who don’t really need them or whose safety might be compromised by taking them. But if we’re being fair, we’d also have to recognize that clamping down on sleeping pill prescriptions could consign many people to chronic, intractable insomnia – and as anyone with longstanding insomnia can attest, it’s a miserable and ultimately unhealthy place to be.

Who’s up for the conversation?

When the patient becomes the doctor’s caretaker

In a video interview, the anonymous doctor’s frustration comes through loud and clear. She takes care of complex patients with many health needs, often working 11 or 12 hours a day, sacrificing time with her family. Yet the message she constantly gets from administrators is that she’s “dumb and inefficient” if she can’t crank patients through the system every 15 minutes.

In a word, she’s abused.

And patients ought to care enough about their doctors to ask them if they’re being abused, according to Dr. Pamela Wible, who raised the issue recently on her blog. “The life you save may save you,” wrote Dr. Wible, a primary care doctor on the West Coast who established her own version of the ideal medical practice after becoming burned-out by the corporate model of care.

This is one of those issues that’s like lifting a corner of the forbidden curtain. Many patients probably don’t think too much about their doctor’s challenges and frustrations. After all, physicians are paid more than enough to compensate for any workplace frustration, aren’t they? Isn’t this what they signed up for?

The problem with this kind of thinking is that it ignores reality. Medicine, especially primary care, has become a difficult, high-pressure environment. One study that tracked the daily routine at a private practice, for example, found that the physicians saw an average of 18 patients a day, made 23.7 phone calls, received 16.8 emails, processed 12.1 prescription refills, and reviewed 19.5 laboratory reports, 11.1 imaging reports and 13.9 consultation reports.

And when physicians are overloaded, unhappy and feel taken advantage of, it tends to be only a matter of time before it spills over into how they interact with their patients.

The million-dollar question here is whether patients can – or should – do anything about it.

Dr. Wible advocates taking a “just ask” approach. Compassion and advocacy by patients for their doctors can accomplish far more than most people think, she says.

One of her blog readers agreed, saying the pressures “must frustrate them beyond endurance. I’m going to start asking.”

Another commenter sounded a note of caution, though: “I feel there is a risk for a patient to ask such a question to a dr. who might be hiding how very fragile he/she is.”

More doubts were voiced at Kevin MD, where Dr. Wible’s blog entry was cross-posted this week. A sample:

- “Abused is a very emotionally loaded word that brings up powerful emotions and feelings like shame. I think if a doc is asked by a patient whether he/she is abused, they might actually end up feeling accused.”

- “I’m having a hard time imagining most docs responding well to their patients asking them if they are abused and I doubt that most docs would respond ‘yes, I am being abused’ to patients who do ask that no matter what was going on in their workplace. Nor do I think most patients want to spend a big chunk of their doctor visit talking about the doctor’s problems and issues.”

- “And what could I do if the answer is ‘yes’?”

I’m not sure what to think. At its core, health care is a transaction between human beings that becomes most healing when all the parties are able to recognize each other’s humanity.

Yet reams have been written about doctor-patient boundaries and the hazards of too much self-disclosure by the physician. Can it ultimately damage the relationship if the doctor shows vulnerability or emotional neediness? What are the ethics of a role reversal that puts the patient in the position of being caretaker to the doctor?

What do readers think? I’d like to know.